Statistical Approaches to Fault Analysis in Multivariate Process Control
De Veaux, Richard D.; Ungar, Lyle H.
After a brief review of some statistical approaches to multivariate process control, we present the use of principal components analysis or partial least squares to limit the number of latent variables to study. While using historical ...
Characterization of Nuclear Fuel using Multivariate Statistical Analysis
Robel, M.; Kristo, M. J.
2007-11-27T23:59:59.000Z
Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
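The PCA reduction described in this abstract can be sketched in a few lines. The data below are synthetic stand-ins: the isotope loadings, burnup grid, and noise level are invented for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 fuel samples x 9 isotopic concentrations
# that vary with burnup (the loadings and burnup grid are invented).
burnup = np.linspace(0.0, 60.0, 200)[:, None]
loadings = rng.normal(size=(1, 9))
X = burnup @ loadings + 0.1 * rng.normal(size=(200, 9))

# Center the data, then project onto the top 3 principal components via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T            # the 3-dimensional approximation

# Fraction of total variance retained by the 3-D representation.
explained = float((s[:3] ** 2).sum() / (s ** 2).sum())
print(scores.shape, round(explained, 3))
```

On data like this, dominated by a single burnup trend, the 3-component scores retain nearly all of the variance, which is what makes the low-dimensional visual representation faithful.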
Paris-Sud XI, Université de
VIBRATION-BASED HEALTH MONITORING APPROACH FOR COMPOSITE STRUCTURES USING MULTIVARIATE STATISTICAL ...
... makes Structural Health Monitoring (SHM) a must for such materials and structures. The development of a proper structural health monitoring system is of crucial importance for such structures because ...
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.
2010-03-01T23:59:59.000Z
Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
Online tools for sequence retrieval and multivariate statistics in molecular biology
Thioulouse, Jean
Online tools for sequence retrieval and multivariate statistics in molecular biology. Guy Perrière. Keywords: World-Wide Web; Sequence data banks; Retrieval system; Multivariate analysis; Sequence analysis. * To whom reprint requests should be sent. Abstract: We have developed a World ...
Method of multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2004-01-06T23:59:59.000Z
A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
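The central factoring step, D = C·Sᵀ via constrained alternating least squares, can be sketched with synthetic data. The matrix sizes are invented, and clipping negative values after each unconstrained solve is a crude stand-in for a proper non-negativity-constrained solver, not the patented procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic weighted spectral data D built from two components:
# C_true (300 pixels x 2) and S_true (80 channels x 2), plus noise.
C_true = rng.random((300, 2))
S_true = rng.random((80, 2))
D = C_true @ S_true.T + 0.01 * rng.normal(size=(300, 80))

# Alternating least squares for D ~= C @ S.T; clipping to zero after each
# unconstrained solve approximates a non-negativity constraint.
C = rng.random((300, 2))
for _ in range(50):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)

# Relative reconstruction error should approach the noise level.
err = float(np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))
print(round(err, 4))
```

The recovered C and S play the roles of the concentration-intensity and spectral-shapes matrices; the unweighting and inspection steps of the method would follow from here.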
Classical least squares multivariate spectral analysis
Haaland, David M. (Albuquerque, NM)
2002-01-01T23:59:59.000Z
An improved classical least squares (CLS) multivariate spectral analysis method that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of the resulting prediction-augmented CLS (PACLS) method is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M. (Albuquerque, NM)
2002-01-01T23:59:59.000Z
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23T23:59:59.000Z
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Glick, D.C.; Davis, A.
1984-07-01T23:59:59.000Z
The multivariate statistical techniques of correlation coefficients, factor analysis, and cluster analysis, implemented by computer programs, can be used to process a large data set and produce a summary of the relationships between variables and between samples. These techniques were used to find relationships in data on the inorganic constituents of US coals. Three hundred thirty-five whole-seam channel samples from six US coal provinces were analyzed for inorganic variables. After consideration of the attributes of data expressed on an ash basis and on a whole-coal basis, it was decided to perform complete statistical analyses on both data sets. Thirty variables expressed on a whole-coal basis and twenty-six variables expressed on an ash basis were used. For each inorganic variable, a frequency distribution histogram and a set of summary statistics were produced. These were subdivided to reveal the manner in which concentrations of inorganic constituents vary between coal provinces and between coal regions. Data collected on 124 samples from three stratigraphic groups (Pottsville, Monongahela, Allegheny) in the Appalachian region were studied using analysis of variance to determine the degree of variability between stratigraphic levels. Most variables showed differences in mean values between the three groups. 193 references, 71 figures, 54 tables.
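The first of the summaries mentioned, the correlation-coefficient matrix, is straightforward to reproduce on synthetic stand-in data; the variables below are invented, not the coal constituents from the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for constituent data: 50 samples x 4 variables,
# with the first two variables deliberately correlated.
a = rng.normal(size=50)
data = np.column_stack([
    a,
    a + 0.3 * rng.normal(size=50),   # strongly tied to the first variable
    rng.normal(size=50),
    rng.normal(size=50),
])

# Correlation-coefficient matrix between variables (columns).
corr = np.corrcoef(data, rowvar=False)
print(corr.shape, round(float(corr[0, 1]), 2))
```

A large off-diagonal entry like corr[0, 1] is exactly the kind of variable-to-variable relationship that the report's factor and cluster analyses then summarize at scale.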
PIXE-quantified AXSIA : elemental mapping by multivariate spectral analysis.
Doyle, Barney Lee; Antolak, Arlyn J. (Sandia National Labs, Livermore, CA); Campbell, J. L. (University of Guelph, Guelph, ON, Canada); Ryan, C. G. (CSIRO Exploration and Mining, Bayview Road, Clayton VIC, Australia); Provencio, Paula Polyak; Barrett, Keith E. (Primecore Systems, Albuquerque, NM); Kotula, Paul Gabriel
2005-07-01T23:59:59.000Z
Automated, nonbiased, multivariate statistical analysis techniques are useful for converting very large amounts of data into a smaller, more manageable number of chemical components (spectra and images) that are needed to describe the measurement. We report the first use of the multivariate spectral analysis program AXSIA (Automated eXpert Spectral Image Analysis), developed at Sandia National Laboratories, to quantitatively analyze micro-PIXE data maps. AXSIA implements a multivariate curve resolution technique that reduces the spectral image data sets into a limited number of physically realizable and easily interpretable components (including both spectra and images). We show that the principal component spectra can be further analyzed using conventional PIXE programs to convert the weighting images into quantitative concentration maps. A common elemental data set has been analyzed using three different PIXE analysis codes, and the results were compared to the cases in which each of these codes was used to separately analyze the associated AXSIA principal component spectral data. We find that the results are in good quantitative agreement.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)
2005-07-26T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)
2005-01-11T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
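The augmentation idea common to these three records can be illustrated with a toy CLS model. The residual-derived shape below (the first singular vector of the calibration residuals) is one plausible reading of "information derived from spectral residuals", not the patents' exact procedure, and all spectra and concentrations are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
n_chan = 120

# Two "known" pure-component spectra plus one unmodeled interferent.
k1, k2, k_unk = rng.random((3, n_chan))

# Calibration set: concentrations are known for the two modeled components,
# but the interferent is present at levels not recorded in the calibration.
c_cal = rng.random((25, 2))
c_unk = rng.random((25, 1))
A_cal = c_cal @ np.vstack([k1, k2]) + c_unk @ k_unk[None, :]

# Classical CLS calibration: estimate pure spectra K from known concentrations.
K = np.linalg.lstsq(c_cal, A_cal, rcond=None)[0]              # 2 x n_chan

# Augmentation: append the dominant shape of the calibration residuals.
resid = A_cal - c_cal @ K
shape = np.linalg.svd(resid, full_matrices=False)[2][0]       # 1st right singular vector
K_aug = np.vstack([K, shape])

# Predict a new sample that contains the interferent.
c_new = np.array([0.3, 0.7])
a_new = c_new @ np.vstack([k1, k2]) + 0.5 * k_unk
est_plain = np.linalg.lstsq(K.T, a_new, rcond=None)[0]        # biased
est_aug = np.linalg.lstsq(K_aug.T, a_new, rcond=None)[0][:2]  # near [0.3, 0.7]
print(np.round(est_aug, 3))
```

The plain CLS estimate is biased because the interferent leaks into the fit, while the augmented model absorbs that spectral variation in the extra shape and recovers the modeled concentrations.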
On-line tools for sequence retrieval and multivariate statistics in molecular biology
Thioulouse, Jean
On-line tools for sequence retrieval and multivariate statistics in molecular biology. Guy Perrière. Keywords: World-Wide Web; Sequence data banks; Retrieval system; Multivariate analysis; Sequence analysis. ... for browsing sequence collections structured under the ACNUC format and for performing multivariate analyses ...
Stellar populations in ω Centauri: a multivariate analysis
Fraix-Burnet, Didier
2015-01-01T23:59:59.000Z
We have performed multivariate statistical analyses of photometric and chemical abundance parameters of three large samples of stars in the globular cluster ω Centauri. The statistical analysis of a sample of 735 stars based on seven chemical abundances with the method of Maximum Parsimony (cladistics) yields the most promising results: seven groups are found, distributed along three branches with distinct chemical, spatial and kinematical properties. A progressive chemical evolution can be traced from one group to the next, but also within groups, suggestive of an inhomogeneous chemical enrichment of the initial interstellar matter. The adjustment of stellar evolution models shows that the groups with metallicities [Fe/H] > -1.5 are helium-enriched, thus presumably of second generation. The spatial concentration of the groups increases with chemical evolution, except for two groups, which stand out in their other properties as well. The amplitude of rotation decreases with chemical evolutio...
Apparatus and system for multivariate spectral analysis
Keenan, Michael R. (Albuquerque, NM); Kotula, Paul G. (Albuquerque, NM)
2003-06-24T23:59:59.000Z
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
Improved permeability prediction using multivariate analysis methods
Xie, Jiang
2009-05-15T23:59:59.000Z
... of tree regression and cross-validation. The third is multivariate adaptive regression splines. The three methods are tested and compared at two complex carbonate reservoirs in west Texas: Salt Creek Field Unit (SCFU) and North Robertson Unit (NRU). The result...
Statistical analysis in multispectral remote sensing
Albert, Walter Gerald
1970-01-01T23:59:59.000Z
... and examines various statistical techniques for their usefulness in crop prediction. These techniques include multiple regression, discriminant analysis, and likelihood ratio tests. Other procedures employed in this paper are univariate and multivariate analysis of variance. Several transformations are performed on the data sets in an attempt to increase accuracy for discrimination of crops. Conclusions of the work undertaken in this paper are presented, and recommendations are made for further...
Study of NGC 5128 Globular Clusters Under Multivariate Statistical Paradigm
Chattopadhyay, Asis Kumar; Davoust, Emmanuel; Mondal, Saptarshi; Sharina, Margarita
2009-01-01T23:59:59.000Z
An objective classification of the globular clusters of NGC 5128 has been carried out by using a model-based approach of cluster analysis. The set of observable parameters includes structural parameters, spectroscopically determined Lick indices and radial velocities from the literature. The optimum set of parameters for this type of analysis is selected through a modified technique of Principal Component Analysis, which differs from the classical one in the sense that it takes into consideration the effects of outliers present in the data. Then a mixture model based approach has been used to classify the globular clusters into groups. The efficiency of the techniques used is tested through the comparison of the misclassification probabilities with those obtained using the K-means clustering technique. On the basis of the above classification scheme three coherent groups of globular clusters have been found. We propose that the clusters of one group originated in the original cluster formation event that coin...
Multivariate Mathematical Morphology applied to Color Image Analysis
Lefèvre, Sébastien
Chapter 10: Multivariate Mathematical Morphology applied to Color Image Analysis. 10.1 Introduction. ... analysis framework, currently fully developed for both binary and gray-level images. Its popularity in the image processing community is mainly due to its rigorous mathematical foundation as well as its inherent ...
Efficient Bayesian multivariate fMRI analysis using a sparsifying spatio-temporal prior
Edinburgh, University of
Efficient Bayesian multivariate fMRI analysis using a sparsifying spatio-temporal prior. Marcel A. ... Available online 1 December 2009. Keywords: Multivariate analysis; Bayesian inference; Expectation propagation; Laplace prior. ... is introduced as a multivariate approach to the analysis of neuroimaging data. It is shown ...
Statistical analysis of aerosol species, trace gases, and meteorology in Chicago
O'Brien, Timothy E.
... in uncovering linear relationships between meteorology and air pollutants in Chicago and aided in determining possible pollutant sources. Keywords: Atmospheric aerosols; Canonical correlation analysis; Chicago air pollution; Multivariate statistics; Principal component analysis; Trace gases. Introduction: Many air ...
A Multivariate Time Series Method for Monte Carlo Reactor Analysis
Taro Ueki
2008-08-14T23:59:59.000Z
A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed the Coarse Mesh Projection Method (CMPM) and can be implemented using coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit from the signal processing discipline with the neutron multiplication eigenvalue problem of the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous-energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three-dimensional spatial modeling of the initial core of a pressurized water reactor.
Multivariate analysis of cross-hole georadar velocity and attenuation tomograms for aquifer zonation
Barrash, Warren
... for characterizing heterogeneous alluvial aquifers. A multivariate statistical technique, known as k-means cluster analysis, ... in a well-studied alluvial aquifer. A comparison of the clustered tomographic section with well-log data ...
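The k-means clustering mentioned here can be sketched on synthetic (velocity, attenuation) pairs. The two "facies", their values, and the deterministic seeding are invented for illustration; a real zonation would cluster actual tomogram cells.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic tomogram cells: (velocity, attenuation) pairs drawn from two
# hypothetical facies; the values and units are invented for illustration.
zone_a = rng.normal([80.0, 1.0], [2.0, 0.2], size=(150, 2))
zone_b = rng.normal([95.0, 3.0], [2.0, 0.2], size=(150, 2))
X = np.vstack([zone_a, zone_b])

# Lloyd's algorithm for k-means with k = 2; seeding with the extreme
# velocities keeps this small demo deterministic.
centers = X[[X[:, 0].argmin(), X[:, 0].argmax()]]
for _ in range(20):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])

centers_v = np.sort(centers[:, 0])
print(centers_v.round(1))   # recovered velocity centers, near 80 and 95
```

The resulting labels partition the cells into zones, which is the aquifer-zonation step that is then compared against well-log data.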
IEEE SIGNAL PROCESSING MAGAZINE 2013: Kernel Multivariate Analysis Framework for ...
Camps-Valls, Gustavo
Kernel Multivariate Analysis Framework for Supervised ... in the literature collectively grouped under the field of Multivariate Analysis (MVA). This paper provides a uniform treatment of ... Canonical Correlation Analysis (CCA) and Orthonormalized PLS (OPLS), as well as their nonlinear extensions derived ...
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE: Learning Multivariate Distributions
Geman, Donald
Learning Multivariate ... for learning high-dimensional multivariate probability distributions from estimated marginals. The approach is ... well-adapted to small-sample learning, where the bias-variance trade-off makes it necessary ...
Sloshing in the LNG shipping industry: risk modelling through multivariate heavy-tail analysis
In the liquefied natural gas (LNG) shipping industry, the phenomenon of sloshing can lead to the occurrence ... in the LNG shipping industry. Keywords: Sloshing; multivariate heavy-tail distribution; asymptotic depen...
Independent Statistics & Analysis
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Multivariate Statistical Analysis of Water Chemistry in Evaluating the
Random matrix approach to multivariate categorical data analysis
Patil, Aashay
2015-01-01T23:59:59.000Z
Correlation and similarity measures are widely used in all areas of the sciences and social sciences. Often the variables are not numbers but qualitative descriptors called categorical data. We define and study the similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow those from random matrix theory. We demonstrate this approach by applying it to data from Indian general elections and sea level pressures in the North Atlantic Ocean.
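A similarity matrix of the kind described can be built by scoring, for each pair of rows, the fraction of categorical variables on which they agree. This particular agreement measure and the synthetic data are illustrative choices, not necessarily the paper's exact definition.

```python
import numpy as np

rng = np.random.default_rng(4)

# 100 synthetic "respondents" x 20 categorical variables with 4 levels each
# (an invented stand-in for ratings or ranking data).
data = rng.integers(0, 4, size=(100, 20))

# Similarity of two rows = fraction of variables on which they agree.
agree = (data[:, None, :] == data[None, :, :]).mean(axis=2)

# Spectrum of the symmetric similarity matrix (ascending eigenvalues).
eig = np.linalg.eigvalsh(agree)
print(agree.shape, round(float(eig[-1]), 1))
```

For structureless data like this, one large eigenvalue reflects the common agreement level and the rest form a bulk; comparing that bulk with random-matrix predictions is the kind of test the paper performs.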
MATH 437/598 Applied Multivariate Statistics Final Project Poster Session
... approaches: the first uses the k-means algorithm, the second the L-method, and the last the GAP statistic ...
Systematic wavelength selection for improved multivariate spectral analysis
Thomas, Edward V. (2828 Georgia NE., Albuquerque, NM 87110); Robinson, Mark R. (1603 Solano NE., Albuquerque, NM 87110); Haaland, David M. (809 Richmond Dr. SE., Albuquerque, NM 87106)
1995-01-01T23:59:59.000Z
Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g., the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets that optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
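The genetic-algorithm wavelength selection can be sketched as a miniature GA over boolean wavelength masks. Everything below is synthetic and simplified: the fitness is one plausible instance of F = f(cost, performance) (fit error plus a per-wavelength cost), not the patent's actual fitness function, and the count-spectrum and smoothing steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
n_wav, n_samp = 40, 60

# Synthetic spectra: only wavelengths 10-14 carry the analyte signal;
# everything else is noise (an invented stand-in for real spectra).
conc = rng.random(n_samp)
X = rng.normal(size=(n_samp, n_wav))
X[:, 10:15] += conc[:, None] * np.linspace(1.0, 2.0, 5)

def fitness(mask):
    """One plausible F = f(cost, performance): fit error plus a cost per wavelength."""
    if not mask.any():
        return -np.inf
    coef = np.linalg.lstsq(X[:, mask], conc, rcond=None)[0]
    err = np.linalg.norm(X[:, mask] @ coef - conc)
    return -(err + 0.2 * mask.sum())

# Tiny genetic algorithm: truncation selection, uniform crossover, bit-flip mutation.
pop = rng.random((30, n_wav)) < 0.2
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]              # keep the 10 fittest masks
    mates = parents[rng.integers(0, 10, size=(30, 2))]
    cross = rng.random((30, n_wav)) < 0.5
    pop = np.where(cross, mates[:, 0], mates[:, 1])      # uniform crossover
    pop ^= rng.random((30, n_wav)) < 0.02                # mutation

best = pop[np.argmax([fitness(m) for m in pop])]
print(np.flatnonzero(best))      # selected wavelength indices
```

The per-wavelength cost term is what keeps the search from selecting every channel: an uninformative wavelength lowers the training error by less than it costs, while the informative ones pay for themselves.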
Scatterplot3d: an R package for Visualizing Multivariate Data
Gotelli, Nicholas J.
Scatterplot3d is an R package for the visualization of multivariate data in a three-dimensional space. R itself is "a language and environment for statistical computing" and "a language for data analysis and graphics". Journal of Statistical Software 8(11), 1-20.
Park, Jinyong (University of Arizona, Tucson, AZ); Balasingham, P. (University of Arizona, Tucson, AZ); McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W. (University of Arizona, Tucson, AZ)
2004-09-01T23:59:59.000Z
Sandia National Laboratories, under contract to the Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geostatistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques to the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of the most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be spatially incomplete. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. The recorded data on volcanic activity for the Sengan region were also produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step.
The following variables were determined to be most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater pH value, presence of volcanic rocks, and presence of hydrothermal alteration. Data available for each of these important geologic variables were used to perform directional variogram modeling and kriging to estimate values for each variable at the 23,949 centers of the chosen 1 km cell grid system that represents the Sengan region. These values formed complete geologic variable vectors at each of the 23,949 1 km cell centers.
Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE
2008-01-01T23:59:59.000Z
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
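The PLS calibration step can be sketched with a hand-rolled PLS1 deflation loop on synthetic "spectra". The emission-line channels, noise level, and choice of two latent variables are invented; a real analysis would use measured LIBS spectra and cross-validated component selection.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "LIBS-like" spectra: 30 samples x 200 channels, where an element
# concentration drives three invented emission-line channels.
conc = rng.random(30)
lines = np.zeros(200)
lines[[40, 90, 150]] = [1.0, 0.6, 0.3]
X = conc[:, None] * lines[None, :] + 0.02 * rng.normal(size=(30, 200))

# PLS1 with 2 latent variables via the classic deflation loop.
Xc, yc = X - X.mean(axis=0), conc - conc.mean()
W, P, q = [], [], []
for _ in range(2):
    w = Xc.T @ yc                       # weight: covariance of channels with y
    w /= np.linalg.norm(w)
    t = Xc @ w                          # scores
    tt = t @ t
    p, qa = Xc.T @ t / tt, (yc @ t) / tt
    Xc = Xc - np.outer(t, p)            # deflate X and y
    yc = yc - qa * t
    W.append(w); P.append(p); q.append(qa)
W, P, q = np.array(W).T, np.array(P).T, np.array(q)

# Regression vector and calibration fit.
B = W @ np.linalg.inv(P.T @ W) @ q
pred = (X - X.mean(axis=0)) @ B + conc.mean()
r2 = float(1 - ((pred - conc) ** 2).sum() / ((conc - conc.mean()) ** 2).sum())
print(round(r2, 3))
```

Because the weight vectors are built from the covariance with the concentration, PLS concentrates on the emission-line channels rather than on overall spectral variance, which is why it copes better with matrix effects than a plain calibration curve.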
Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel; Borek, Theodore Thaddeus, III
2010-09-01T23:59:59.000Z
Thermal decomposition of polydimethylsiloxane (PDMS) compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. TD/GC-MS is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas, including materials analysis; sports medicine, for the detection of designer drugs; and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. The approach has also demonstrated potential for success in finding and enabling identification of trace compounds.
Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves utilizing a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
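The varimax factor-rotation step mentioned above can be sketched with Kaiser's SVD-based algorithm; the loadings here are made up, and the code is a generic illustration rather than Sandia's analysis:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-8):
    # Kaiser's varimax: iterate SVDs of the criterion gradient to find
    # the orthogonal rotation R that simplifies the loadings L @ R.
    p, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        grad = L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt                      # best orthogonal update this step
        if s.sum() - var < tol:
            break
        var = s.sum()
    return L @ R, R

# Made-up loadings: four variables on two entangled factors.
L = np.array([[0.8, 0.6], [0.7, 0.7], [0.6, -0.8], [0.7, -0.7]])
Lr, R = varimax(L)

def simplicity(M):
    # Varimax criterion: total variance of the squared loadings per factor.
    return float(((M ** 2).var(axis=0)).sum())

print(simplicity(L), simplicity(Lr))
```

The rotation leaves the fitted subspace unchanged; it only redistributes loadings so each factor is dominated by a few variables, which is what makes the offgassing components easier to interpret chemically.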
AstroStat - A VO Tool for Statistical Analysis
Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin
2015-01-01T23:59:59.000Z
AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analysis is done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on ...
STATISTICAL PHONE: 530.752.2361
Wang, Jane-Ling
. from 1995 to 1998, he developed further expertise in software development, statistical programming, analysis, programming, and interpretation. Since joining the Statistical Laboratory in 2005, he has ... R. Beran: multivariate regression, bootstrap methods, statistics on manifolds, asymptotic theory ...
Reichardt, Thomas A.; Timlin, Jerilyn Ann; Jones, Howland D. T.; Sickafoose, Shane M.; Schmitt, Randal L.
2010-09-01T23:59:59.000Z
Laser-induced fluorescence measurements of cuvette-contained laser dye mixtures are made for evaluation of multivariate analysis techniques to optically thick environments. Nine mixtures of Coumarin 500 and Rhodamine 610 are analyzed, as well as the pure dyes. For each sample, the cuvette is positioned on a two-axis translation stage to allow the interrogation at different spatial locations, allowing the examination of both primary (absorption of the laser light) and secondary (absorption of the fluorescence) inner filter effects. In addition to these expected inner filter effects, we find evidence that a portion of the absorbed fluorescence is re-emitted. A total of 688 spectra are acquired for the evaluation of multivariate analysis approaches to account for nonlinear effects.
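For reference, the primary and secondary inner filter effects this abstract refers to are commonly corrected with the textbook approximation F_corr = F_obs · 10^((A_ex + A_em)/2), where A_ex and A_em are the absorbances at the excitation and emission wavelengths. A minimal sketch with illustrative numbers (the re-emission effect noted above is not modeled):

```python
def inner_filter_correct(f_obs, a_ex, a_em):
    # Correct observed fluorescence for primary (excitation) and
    # secondary (emission) absorption, using the mid-cuvette approximation.
    return f_obs * 10.0 ** ((a_ex + a_em) / 2.0)

# Illustrative: 100 counts observed with A_ex = 0.2 and A_em = 0.1.
print(round(inner_filter_correct(100.0, 0.2, 0.1), 2))
```

Spatially resolved measurements like those described, where the cuvette is translated relative to the beam, are one way to separate the two absorption terms rather than lumping them into a single correction.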
Scientific Data Analysis via Statistical Learning
Geddes, Cameron Guy Robinson
observations and simulations. Statistical machine learning algorithms have enormous potential to provide ... data, and the analysis of hurricanes and tropical storms in climate simulations. Supervised Learning for Supernova ... Scientific Data Analysis via Statistical Learning, Raquel Romano, romano at hpcrd dot lbl dot gov
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R. (Albuquerque, NM)
2007-10-16T23:59:59.000Z
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or another factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
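The factored-representation idea can be illustrated with a truncated SVD on a synthetic (pixels × channels) matrix; this is a generic sketch under assumed data, not the patented algorithm's implementation:

```python
import numpy as np

rs = np.random.RandomState(1)
# Synthetic "image": 500 pixels x 64 channels built from 3 components.
C = rs.rand(500, 3)                 # per-pixel component concentrations
S = rs.rand(3, 64)                  # component spectra
D = C @ S + 1e-4 * rs.randn(500, 64)

# Factor the data and keep only the most significant factors.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = 3
Dk = (U[:, :k] * s[:k]) @ Vt[:k]    # rank-k reconstruction

rel_err = np.linalg.norm(D - Dk) / np.linalg.norm(D)
ratio = D.size / (k * (D.shape[0] + D.shape[1]))  # storage saved by factoring
print(round(float(rel_err), 6), round(ratio, 1))
```

Downstream operations can then act on the small factors U[:, :k]·s and Vt[:k] instead of the full matrix, which is the source of the computational savings the abstract describes.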
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)
2013-11-15T23:59:59.000Z
Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study are provided. • Accurate modeling of time-variant lifespan distributions is critical for estimation. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.
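The sales/stock/lifespan linkage at the core of the IOA method can be sketched with a toy model: waste arising in year t is the sum over past sales cohorts weighted by the probability of discard at that age, and stock is whatever survives. The numbers and the Weibull lifespan below are illustrative assumptions, not the paper's Dutch data:

```python
import math

sales = [100, 120, 140, 160, 180, 200]     # units sold in years 0..5 (made up)
shape, scale = 2.0, 4.0                    # assumed Weibull lifespan (years)

def discard_cdf(age):
    # Probability that a unit has been discarded by the given age.
    return 1.0 - math.exp(-((age / scale) ** shape))

def waste_in_year(t):
    # Units discarded during year t, summed over all earlier sales cohorts.
    total = 0.0
    for year, sold in enumerate(sales):
        if year <= t:
            age = t - year
            total += sold * (discard_cdf(age + 1) - discard_cdf(age))
    return total

w5 = waste_in_year(5)
stock5 = sum(sold * (1.0 - discard_cdf(5 - year + 1))
             for year, sold in enumerate(sales))
print(round(w5, 1), round(stock5, 1))
```

Every unit sold is either still in stock or has entered the waste stream, so cumulative waste plus surviving stock equals cumulative sales; that mass balance is what lets incomplete sales, stock, or waste series constrain one another.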
Spatial compression algorithm for the analysis of very large multivariate images
Keenan, Michael R. (Albuquerque, NM)
2008-07-15T23:59:59.000Z
A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
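A minimal stand-in for the wavelet-based spatial compression idea, using a hand-rolled orthonormal Haar transform on a 1-D signal (the patent's block algorithm and 2-D details are not reproduced):

```python
import numpy as np

def haar(x):
    # Full orthonormal Haar decomposition of a length-2^k signal.
    x = np.asarray(x, dtype=float).copy()
    bands = []
    while len(x) > 1:
        bands.append((x[0::2] - x[1::2]) / np.sqrt(2))  # detail band
        x = (x[0::2] + x[1::2]) / np.sqrt(2)            # approximation
    bands.append(x)
    return np.concatenate(bands[::-1])

def ihaar(c):
    # Invert haar(): rebuild from the coarsest approximation outward.
    c = np.asarray(c, dtype=float)
    x, pos = c[:1].copy(), 1
    while pos < len(c):
        d = c[pos:pos + len(x)]
        nxt = np.empty(2 * len(x))
        nxt[0::2] = (x + d) / np.sqrt(2)
        nxt[1::2] = (x - d) / np.sqrt(2)
        x, pos = nxt, pos + len(d)
    return x

sig = np.sin(np.linspace(0.0, np.pi, 64))   # smooth stand-in "image row"
c = haar(sig)
thresh = np.sort(np.abs(c))[::-1][15]       # keep the 16 largest coefficients
ck = np.where(np.abs(c) >= thresh, c, 0.0)
rel_err = np.linalg.norm(ihaar(ck) - sig) / np.linalg.norm(sig)
print(round(float(rel_err), 5))
```

Because smooth images concentrate their energy in a few wavelet coefficients, discarding the small ones shrinks the pixel dimension of the data matrix while keeping the information content, which is the premise of the spatial compression described above.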
Statistical analysis of correlated fossil fuel securities
Li, Derek Z
2011-01-01T23:59:59.000Z
Forecasting the future prices or returns of a security is extraordinarily difficult, if not impossible. However, statistical analysis of a basket of highly correlated securities offering a cross-sectional representation of ...
Stratigraphic statistical curvature analysis techniques
Bengtson, C.A.; Ziagos, J.P.
1987-05-01T23:59:59.000Z
SCAT applies statistical techniques to dipmeter data to identify patterns of bulk curvature, determine transverse and longitudinal structural directions, and reconstruct cross sections and contour maps. STRAT-SCAT applies the same concepts to geometric interpretation of multistoried unimodal, bimodal, or trough-type cross-bedding and also to seismic stratigraphy-scale stratigraphic structures. Structural dip, which comprises the bulk of dipmeter data, is related to beds that (statistically) were deposited with horizontal attitudes; stratigraphic dip is related to beds that were deposited with preferentially oriented nonhorizontal attitudes or to beds that assumed such attitudes because of differential compaction. Stratigraphic dip generates local zones of departure from structural dip on special SCAT plots. The RMS (root-mean-square) of apparent structural dip is greatest in the (structural) T-direction and least in the perpendicular L-direction; the RMS of stratigraphic dip (measured with respect to structural dip) is greatest in the stratigraphic T*-direction and least in the stratigraphic L*-direction. Multistoried cross-bedding appears on T*-plots as local zones of either greater scatter or statistically significant departure of stratigraphic median dip from structural dip. In contrast, the L*-plot (except for trough-type cross-bedding) is insensitive to cross-bedding. Seismic stratigraphy-scale depositional sequences are identified on Mercator dip versus azimuth plots and polar tangent plots as secondary cylindrical-fold patterns imposed on global structural patterns. Progradational sequences generate local cycloid-type patterns on T*-plots, and compactional sequences generate local half-cusp patterns. Both features, however, show only structural dip on L*-plots.
Statistical Design, Analysis and Graphics for the Guadalupe
Statistical Design, Analysis and Graphics for the Guadalupe River Assessment Technical Memoranda, Science Center (2013).
Meta-Analysis for Longitudinal Data Models using Multivariate Mixture Priors
West, Mike
of multivariate normals, accommodating population heterogeneity, outliers and non-linearity in regression. First, the random effects model is a flexible mixture of multivariate normals, accommodating population ...
STATISTICAL ANALYSIS OF PROTEIN FOLDING KINETICS
Dinner, Aaron
STATISTICAL ANALYSIS OF PROTEIN FOLDING KINETICS, AARON R. DINNER, New Chemistry Laboratory. From ... for Protein Folding: Advances in Chemical Physics, Volume 120, edited by Richard A. Friesner. Experimental and theoretical studies have led to the emergence of a unified general mechanism for protein ...
Li, Deyuan
ScienceDirect, Journal of Multivariate Analysis, journal homepage: www.elsevier.com/locate/jmva. A note on tail dependence. Such a dependence structure is critical for various purposes, which include asset pricing and portfolio optimization. For a random vector (Y, Z) with the same marginal distributions, the TDC is defined as λ = lim_{u→∞} P(Y > u | Z > u).
Stanford University
Multivariate analysis and prediction of wind turbine response to varying wind field characteristics. ... effects on wind turbines are essential not only for designing, but also for cost-efficiently managing wind ... Universitätsstr. 150, 44780 Bochum, Germany; email: hartus@inf.bi.rub.de. ABSTRACT: Site-specific wind field ...
annihilation factor analysis: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Correlated Bayesian Factor Analysis. CiteSeer Summary: Factor analysis is a method in multivariate statistical analysis that can help scientists determine which variables to...
Cloud-Based Statistical Analysis from Users' Perspective Botong Huang
Yang, Jun
efficient statistical analysis programs requires tremendous expertise and effort. Most statisticians would much prefer programming in languages familiar to them, such as R and MATLAB. Copyright 2014 IEEE. Cumulon: Cloud-Based Statistical Analysis from Users' Perspective, Botong Huang, Department ...
Wriggers, Willy
Revised Manuscript Received: September 3, 2008. A multivariate statistical theory, local feature analysis (LFA) ... be described by harmonic potential wells. For most systems, the dimension n of this essential space is very ... Coarse-Graining Protein Structures With Local Multivariate Features from Molecular Dynamics, Zhiyong ...
On quantum statistics in data analysis
Dusko Pavlovic
2008-05-13T23:59:59.000Z
Originally, quantum probability theory was developed to analyze statistical phenomena in quantum systems, where classical probability theory does not apply, because the lattice of measurable sets is not necessarily distributive. On the other hand, it is well known that the lattices of concepts, that arise in data analysis, are in general also non-distributive, albeit for completely different reasons. In his recent book, van Rijsbergen argues that many of the logical tools developed for quantum systems are also suitable for applications in information retrieval. I explore the mathematical support for this idea on an abstract vector space model, covering several forms of data analysis (information retrieval, data mining, collaborative filtering, formal concept analysis...), and roughly based on an idea from categorical quantum mechanics. It turns out that quantum (i.e., noncommutative) probability distributions arise already in this rudimentary mathematical framework. We show that a Bell-type inequality must be satisfied by the standard similarity measures, if they are used for preference predictions. The fact that already a very general, abstract version of the vector space model yields simple counterexamples for such inequalities seems to be an indicator of a genuine need for quantum statistics in data analysis.
Non resonant transmission modelling with Statistical modal Energy distribution Analysis
Boyer, Edmond
be used as an alternative to Statistical Energy Analysis for describing subsystems with low modal overlap. Non resonant transmission modelling with Statistical modal Energy distribution Analysis, L. Maxit, ... Capelle, F-69621 Villeurbanne Cedex, France. Statistical modal Energy distribution Analysis (SmEdA) can ...
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27T23:59:59.000Z
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
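The statistical (as opposed to deterministic) hot-channel approach can be sketched generically: sample the uncertain factors, propagate them through the ratio of interest, and read off the percentile that bounds it with the desired probability. The distributions below are illustrative assumptions, not the NBSR uncertainties or the Sudo-Kaminaga/Saha-Zuber correlations:

```python
import random
random.seed(0)

N = 100_000
ratios = []
for _ in range(N):
    q_hot = random.gauss(1.00, 0.05)   # hot-channel heat-flux factor
    flow = random.gauss(1.00, 0.03)    # coolant-flow uncertainty factor
    chf = random.gauss(2.00, 0.10)     # critical heat flux, relative units
    ratios.append(chf * flow / q_hot)  # CHFR-like thermal-limit ratio

ratios.sort()
p5 = ratios[int(0.05 * N)]             # value exceeded with 95% probability
print(round(p5, 3))
```

If the 5th percentile of the sampled ratio stays above 1, the limit is satisfied with 95% probability, which is the same style of statement the abstract makes about CHFR and OFIR.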
Statistics for Analysis of Experimental Data Catherine A. Peters
Peters, Catherine A.
Statistics for Analysis of Experimental Data, Catherine A. Peters, Department of Civil Engineering, Princeton University, Princeton, NJ 08544. In: ... Processes Laboratory Manual, S. E. Powers, Ed., AEESP, Champaign, IL, 2001. Statistics is a mathematical tool for quantitative analysis of data
Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring
Dayman, Kenneth J. [Univ. of Texas at Austin, TX (United States); Coble, Jamie B. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Orton, Christopher R. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
2014-01-01T23:59:59.000Z
The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively. 
The errors seen in this work are artificially low, because the models were trained, optimized, and tested on simulated, noise-free data. However, these results indicate that the developed models may generalize well to new data and that the proposed approach constitutes a viable first step in developing a fuel characterization algorithm based on gamma spectra.
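The reactor-type classification step can be illustrated with the simplest of the algorithms listed, k-nearest neighbors, on fabricated two-feature data standing in for nuclide activities (not the ORIGEN-ARP dataset):

```python
import math
import random
random.seed(1)

def knn_predict(train, x, k=5):
    # Majority vote among the k nearest training points (Euclidean distance).
    dists = sorted((math.dist(feat, x), label) for feat, label in train)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Two synthetic clusters standing in for PWR- and BWR-like signatures.
train = [((random.gauss(0, 1), random.gauss(0, 1)), "PWR") for _ in range(50)]
train += [((random.gauss(4, 1), random.gauss(4, 1)), "BWR") for _ in range(50)]

print(knn_predict(train, (0.2, -0.1)), knn_predict(train, (4.1, 3.8)))
```

In the scheme described above, a classifier of this kind only routes each spectrum to the appropriate reactor-type-specific burnup model; the burnup prediction itself comes from the partial least squares stage.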
Experimental control analysis of a fuel gas saturator. Final report. [Multivariable
Terwilliger, G.E.; Brower, A.S.; Baheti, R.S.; Smith, R.E.; Brown, D.H.
1985-01-01T23:59:59.000Z
The multivariable control of the clean fuel gas saturator of a coal gasification process has been demonstrated. First-principles process models described the process dynamics, from which linear models were generated and used for the actual control designs. The multivariable control was designed, its response to transients was simulated, and the controls were implemented in a computer controller for a fuel gas saturator. The test results obtained for the gas flow transients showed good correlation with the computer simulations, giving confidence in the ability of the simulation to predict the plant performance for other transients. In this study, both time and frequency domain multivariable design techniques were applied to provide the best possible design and to determine their relative effectiveness. No clear guidelines resulted; it appears that the selection may be made on the basis of personal preference, experience, or the availability of computer-aided design tools, rather than inherent technical differences. This EPRI/GE fuel gas saturator control demonstration has shown that multivariable design techniques can be applied to a real process and that practical controls can be developed. With suitable process models, presently available computer-aided control design software allows the control design, evaluation and implementation to be completed in a reasonable time period. The application of these techniques to power generation processes is recommended.
Seismic Attribute Analysis Using Higher Order Statistics
Greenidge, Janelle Candice
2009-05-15T23:59:59.000Z
Seismic data processing depends on mathematical and statistical tools such as convolution, crosscorrelation and stack that employ second-order statistics (SOS). Seismic signals are non-Gaussian and therefore contain information beyond SOS. One...
Multivariate analysis of exhaust emissions from heavy-duty diesel fuels
Sjoegren, M.; Ulf, R.; Li, H.; Westerholm, R. [Stockholm Univ. (Sweden)
1996-01-01T23:59:59.000Z
Particulate and gaseous exhaust emission phases from running 10 diesel fuels on two makes of heavy-duty diesel engines were analyzed with respect to 63 chemical descriptors. Measurements for one of the fuels were also made in the presence of an exhaust aftertreatment device. The variables included 28 polycyclic aromatic compounds (PAC), regulated pollutants (CO, HC, NO{sub x}, particles), and 19 other organic and inorganic exhaust emission components. Principal components analysis (PCA) was applied for the statistical exploration of the obtained data. In addition, relationships between chemical (12 variables) and physical (12 variables) parameters of the fuels to the exhaust emissions were derived using partial least squares (PLS) regression. Both PCA and PLS models were derived for the engine makes separately. The PCA showed that the most descriptive exhaust emission factors from these diesel fuels included fluoranthene as a representative of PAC, the regulated pollutants, sulfates, methylated pyrenes, and monoaromatics. Exhaust emissions were significantly decreased in the presence of an exhaust aftertreatment device. Both engine makes exhibited similar patterns of exhaust emissions. Discrepancies were observed for the exhaust emissions of CO{sub 2} and oil-derived soluble organic fractions, owing to differences in engine design. The PLS analysis showed a good correlation of exhaust emission of the regulated pollutants and PAC with the contents of PAC in the fuels and the fuel aromaticity. 41 refs., 6 figs., 6 tabs.
Statistical Analysis of Transient Cycle Test Results in a 40...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Analysis of Transient Cycle Test Results in a 40 CFR Part 1065 Engine Dynamometer Test Cell
Electricity Case: Statistical Analysis of Electric Power Outages
Wang, Hai
Electricity Case: Statistical Analysis of Electric Power Outages. CREATE Report, July 26, 2005, Jeffrey S. Simonoff (NYU) ... of the United States Department of Homeland Security.
Multivariate Analysis from a Statistical Point of View K.S. Cranmer
Fernandez, Thomas
for the results of the search for the Standard Model Higgs boson at LEP [4]. The Neyman-Pearson theory (which we
analysis identifies susceptibility: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
(controls). Only statistically significant ... Zelikovsky, Alexander. Application of Multivariate Analysis to Identify Soil ... CiteSeer Summary: In the center of South America,...
Statistical analysis of Single Nucleotide Polymorphism microarrays in cancer
Paris-Sud XI, UniversitÃ© de
Statistical analysis of Single Nucleotide Polymorphism microarrays in cancer studies, Pierre Neuvial. ... Single Nucleotide Polymorphism (SNP) arrays. We define the copy number states formally, and show how ...
analysis sampling statistics: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
bounds point out a difficu... Blum, Avrim; Yang, Ke (2003-01-01). Statistical Energy Analysis ... Energy Storage, Conversion and Utilization Websites...
Overview of multivariate methods and their application to studies of wildlife habitat
Shugart, H.H. Jr.
1980-01-01T23:59:59.000Z
Multivariate statistical techniques as methods of choice in analyzing habitat relations among animals have distinct advantages over competing methodologies. These considerations, joined with a reduction in the cost of computer time, the increased availability of multivariate statistical packages, and an increased willingness on the part of ecologists to use mathematics and statistics as tools, have created an exponentially increasing interest in multivariate statistical methods over the past decade. It is important to note that the earliest multivariate statistical analyses in ecology did more than introduce a set of appropriate and needed methodologies to ecology. The studies emphasized different spatial and organizational scales from those typically emphasized in habitat studies. The new studies, which used multivariate methods, emphasized individual organisms' responses in a heterogeneous environment. This philosophical (and, to some degree, methodological) emphasis on heterogeneity has led to a potential to predict the consequences of disturbances and management on wildlife habitat. One recent development in this regard has been the coupling of forest succession simulators with multivariate analysis of habitat to predict habitat availability under different timber management procedures.
A statistical analysis of lead concentrations in human lung samples
Stringer, Claude Allen
1973-01-01T23:59:59.000Z
A STATISTICAL ANALYSIS OF LEAD CONCENTRATIONS IN HUMAN LUNG SAMPLES. A Thesis by CLAUDE ALLEN STRINGER, JR. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, May 1973. Major Subject: Chemistry. Approved as to style and content by: (Chairman of Committee), (Head of Department), (Member), (Memb...
An economic and statistical analysis of pecan prices
Hertel, Karlene Sharon
1979-01-01T23:59:59.000Z
AN ECONOMIC AND STATISTICAL ANALYSIS OF PECAN PRICES. A Thesis by KARLENE SHARON HERTEL. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, August 1979. Major Subject: Agricultural Economics. Approved as to style and content by: (Chairman of Committee), (Head of Department), (Member), (Member). August 1979. ABSTRACT...
Special problems in statistical critical pert analysis
Robieux, Christian Claude
1978-01-01T23:59:59.000Z
the mathematical problem. Section 1.3.5 presents different kinds of remainder term and explains the consequences of these results. But before that we need to sum up the structure of the network by a certain kind of statistical relationship between the paths...". The definition of Σ is Σ = E[(T − E(T))(T − E(T))′]. Since T = PX, we have Σ = P E[(X − E(X))(X − E(X))′] P′. The independence of the x_i's implies Σ = P diag(σ₁², ..., σ_b²) P′ (1.1.2), where σᵢ² is the variance of the i-th activity time. As diag...
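The covariance identity in this passage, Cov(T) = P diag(σ₁², ..., σ_b²) P′ for independent activity times X and path-incidence matrix P, can be checked numerically on a toy two-path network (illustrative numbers only):

```python
import numpy as np

P = np.array([[1, 1, 0],    # path 1 uses activities 1 and 2
              [1, 0, 1]])   # path 2 uses activities 1 and 3
act_var = np.array([4.0, 1.0, 2.0])   # variances of the independent activities

# Covariance of the path times T = PX when Cov(X) = diag(act_var).
cov_T = P @ np.diag(act_var) @ P.T
print(cov_T)
```

The off-diagonal entry comes entirely from activity 1, which both paths share; that shared-activity correlation between paths is exactly the statistical relationship the passage says must be accounted for.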
Independent Statistics & Analysis Drilling Productivity Report
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Course STAT 6348.501 Applied Multivariate Analysis Professor Robert Serfling
Serfling, Robert
Analysis. The Classification Problem: Discriminant Analysis, Support Vector Machines, Classification the professor's notes accompanied by handouts distributed by email and from library eBooks. Also, other sources). T 11/15 R 11/17 Classification: discriminant analysis; support vector machines. Support vector machines
Shapiro, Alex
Statistical Institute. Towards a Unified Theory of Inequality Constrained Testing in Multivariate Analysis, A. Shapiro. Introduction: Statistical inference for equality constrained problems in multivariate analysis is well established ... properties of linear spaces. In particular it is meaningful to consider an orthogonal
A statistical analysis of personnel contaminations in 200 Area facilities
Wagner, M.A.; Stoddard, D.H.
1983-05-18T23:59:59.000Z
This study determined the frequency statistics of personnel contaminations in 200 Area facilities. These statistics are utilized in probability calculations for contamination risks, and are part of an effort to provide reliable information for use in safety studies. Data for this analysis were obtained from the 200 Area and the Tritium Area Fault Tree Data Banks and were analyzed with the aid of the STATPAC computer code.
Application of Exploratory Multivariate Analysis for Network
Khaled Labib and V. Rao Vemuri
Vemuri, Rao
attacks. The k-means, hierarchical clustering, self-organizing maps, and principal component analysis methods are considered; a comparison of the performance, features, graphical representation, and applicability of each method is possible through different computer programs. In order to be able to do a meaningful comparison, several tools must
Multivariate Analysis of Longitudinal Ordinal Data with Mixed Effects Models, with Application to
to Clinical Outcomes in Osteoarthritis. Celine Marielle Laffont, Marc Vandemeulebroecke, Didier Concordet. ... are used. Typically, four ordinal outcomes are measured in clinical trials, including the posture of a dog ... However, the standard methods for data analysis use unidimensional models
Introduction to Statistical Linear Models Spring 2005
Syllabus: Introduction to Statistical Linear Models, 960:577:01, Spring 2005. Instructor: Farid. Analysis of multivariate data in the language of matrices and vectors. Broad introduction to MATLAB/Octave, R (S). Text: "... Statistical Analysis," Fifth edition, Prentice Hall, 2002. Other sources may be required and will be posted.
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01T23:59:59.000Z
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
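As a flavor of the kind of chi-square minimization fit the report describes, here is a minimal sketch, in Python rather than R, fitting a single-exponential model to synthetic correlator-like data (the model, parameter values, and noise level are invented for illustration, not taken from the report):

```python
import numpy as np
from scipy.optimize import curve_fit

# Single-exponential model, the form often fitted to 2-pt correlators
# at large times: C(t) = A * exp(-m * t).
def corr(t, A, m):
    return A * np.exp(-m * t)

rng = np.random.default_rng(0)
t = np.arange(1, 16, dtype=float)
true_A, true_m = 1.2, 0.35                    # invented "true" values
sigma = 0.01 * corr(t, true_A, true_m)        # 1% relative errors
data = corr(t, true_A, true_m) + rng.normal(0.0, sigma)

# Weighted least squares == chi-square minimization for independent errors.
popt, pcov = curve_fit(corr, t, data, p0=[1.0, 0.5],
                       sigma=sigma, absolute_sigma=True)
chi2 = np.sum(((data - corr(t, *popt)) / sigma) ** 2)
dof = len(t) - len(popt)
print("A, m =", popt, " chi2/dof =", chi2 / dof)
```

With absolute_sigma=True, curve_fit performs exactly the weighted least-squares (chi-square) minimization, and chi2/dof near 1 signals a consistent fit.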
Statistical Error analysis of Nucleon-Nucleon phenomenological potentials
R. Navarro Perez; J. E. Amaro; E. Ruiz Arriola
2014-06-10T23:59:59.000Z
Nucleon-Nucleon potentials are commonplace in nuclear physics and are determined from a finite number of experimental data with limited precision sampling the scattering process. We study the statistical assumptions implicit in the standard least squares fitting procedure and apply, along with more conventional tests, a tail-sensitive quantile-quantile test as a simple and confident tool to verify the normality of residuals. We show that the fulfilment of normality tests is linked to a judicious and consistent selection of a nucleon-nucleon database. These considerations prove crucial to a proper statistical error analysis and uncertainty propagation. We illustrate these issues by analyzing about 8000 published proton-proton and neutron-proton scattering data. This enables the construction of potentials meeting all statistical requirements necessary for statistical uncertainty estimates in nuclear structure calculations.
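The residual-normality check described above can be sketched with a standard test; the authors' tail-sensitive quantile-quantile test is specialized, so this Python sketch substitutes the common Shapiro-Wilk test, and both data sets below are simulated, not the scattering database:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Residuals of a well-behaved fit should be ~ N(0, 1) after scaling
# by the quoted experimental errors.
good_residuals = rng.normal(0.0, 1.0, size=500)

# An inconsistent database shows up as heavy-tailed residuals;
# Student-t with 2 degrees of freedom mimics that.
bad_residuals = rng.standard_t(df=2, size=500)

w_good, p_good = stats.shapiro(good_residuals)
w_bad, p_bad = stats.shapiro(bad_residuals)

# A small p-value rejects normality, flagging the inconsistent set.
print("p(good) =", p_good, " p(bad) =", p_bad)
```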
Damping Estimation of Plates for Statistical Energy Analysis
Vatti, Kranthi
2011-06-01T23:59:59.000Z
.R.D.M. algorithm. Statistical Energy Analysis (S.E.A.), which is a natural extension of the Power Input Method, is used to evaluate coupling loss factors for two sets of plates, one set joined along a line and the other set joined at a point. Two alternative...
Statistical mechanical analysis of the dynamics of learning in perceptrons
Coolen, ACC "Ton"
... with constant learning rate. 2.5. Theory versus simulations. 3. On-line learning: complete training sets. Statistical mechanical analysis of the dynamics of learning in perceptrons. C. W. H. Mace and A. C. C. Coolen. ... to analyse the dynamics of various classes of supervised learning rules in perceptrons. The character
Statistical Analysis of Protein Folding Kinetics Aaron R. Dinner
Dinner, Aaron
Statistical Analysis of Protein Folding Kinetics. Aaron R. Dinner, Sung-Sau So, and Martin ... Experimental and theoretical studies over several years have led to the emergence of a unified general mechanism for protein folding that serves as a framework for the design and interpretation of research in this area [1]
Recurrence time statistics: Versatile tools for genomic DNA sequence analysis
Gao, Jianbo
Recurrence time statistics: Versatile tools for genomic DNA sequence analysis. Yinhe Cao, Wen ... and the genomes of many other organisms waiting to be sequenced, it has become increasingly important to develop methods for extracting information from DNA sequences. One of the more important structures in a DNA sequence is repeat-related. Often
Advanced Analysis Qualifying Examination Department of Mathematics and Statistics
Massachusetts at Amherst, University of
NAME: Advanced Analysis Qualifying Examination, Department of Mathematics and Statistics, University of Massachusetts at Amherst. 1. Let F be a continuous increasing invertible function, and let mu_F be the Lebesgue-Stieltjes measure associated to F. ... the indicator function or characteristic function of A. 2. If a measure is not specified, use Lebesgue measure on R.
Water O?H Stretching Raman Signature for Strong Acid Monitoring via Multivariate Analysis
Casella, Amanda J.; Levitskaia, Tatiana G.; Peterson, James M.; Bryan, Samuel A.
2013-04-16T23:59:59.000Z
Spectroscopic techniques have been applied extensively for quantification and analysis of solution compositions. In addition to static measurements, these techniques have been implemented in flow systems providing real-time solution information. A distinct need exists for information regarding acid concentration, as it affects the extraction efficiency and selectivity of many separation processes. Despite the seeming simplicity of the problem, no practical solution has yet been offered, particularly for large-scale schemes involving toxic streams such as highly radioactive nuclear wastes. The classic potentiometric technique is not amenable to on-line measurements in nuclear fuel reprocessing due to requirements of frequent calibration/maintenance and poor long-term stability in the aggressive chemical and radiation environments. In this work, the potential of using Raman spectroscopic measurements for on-line monitoring of strong acid concentration in solutions relevant to dissolved used fuel was investigated. The Raman water signature was monitored and recorded for nitric and hydrochloric acid solution systems of systematically varied chemical composition, ionic strength, and temperature. The generated Raman spectroscopic database was used to develop predictive chemometric models for the quantification of the acid concentration (H+), neodymium concentration (Nd3+), nitrate concentration (NO3-), density, and ionic strength. This approach was validated using a flow solvent extraction system.
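The chemometric modeling step amounts to regressing a property of interest on spectral intensities. A minimal sketch, assuming a synthetic single-band "spectrum" and ordinary least squares in place of the paper's actual chemometric models (all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: 80 samples x 30 spectral channels.
n_samples, n_channels = 80, 30
concentration = rng.uniform(0.1, 6.0, size=n_samples)   # e.g. acid, mol/L

# Each "spectrum" = concentration-scaled Gaussian band + baseline + noise.
channels = np.arange(n_channels)
band = np.exp(-0.5 * ((channels - 15) / 4.0) ** 2)
spectra = (concentration[:, None] * band
           + 0.2                                        # constant baseline
           + rng.normal(0.0, 0.02, size=(n_samples, n_channels)))

# Calibration: least-squares regression of concentration on the
# spectral channels plus an intercept column.
X = np.hstack([spectra, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(X, concentration, rcond=None)

# Predict a held-out sample of known concentration 3.0.
x_new = np.append(3.0 * band + 0.2 + rng.normal(0.0, 0.02, n_channels), 1.0)
pred = x_new @ coef
print("predicted concentration:", pred)
```

Real chemometric practice typically uses factor-based methods (PCR or PLS) rather than raw least squares, but the calibrate-then-predict workflow is the same.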
Statistical fractal analysis of 25 young star clusters
Gregorio-Hetem, J; Santos-Silva, T; Fernandes, B
2015-01-01T23:59:59.000Z
A large sample of young stellar groups is analysed, aiming to investigate their clustering properties and dynamical evolution. A comparison of the Q statistical parameter, measured for the clusters, with the fractal dimension estimated for the projected clouds shows that 52% of the sample has substructures and tends to follow the theoretically expected relation between clusters and clouds, according to calculations for an artificial distribution of points. The fractal statistics were also compared to structural parameters, revealing that clusters having a radial density profile show a trend of the parameter s increasing with mean surface stellar density. The core radius of the sample, as a function of age, follows a distribution similar to that observed in stellar groups of the Milky Way and other galaxies. They also have dynamical ages, indicated by their crossing times, similar to those of unbound associations. The statistical analysis allowed us to separate the sample into two groups showing different clustering characteristi...
HistFitter software framework for statistical data analysis
M. Baak; G. J. Besjes; D. Cote; A. Koutsman; J. Lorenz; D. Short
2014-10-06T23:59:59.000Z
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple data models at once, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication-quality style through a simple command-line interface.
Lifetime statistics of quantum chaos studied by a multiscale analysis
Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)
2012-04-30T23:59:59.000Z
In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.
Statistical Analysis of Abnormal Electric Power Grid Behavior
Ferryman, Thomas A.; Amidan, Brett G.
2010-10-30T23:59:59.000Z
Pacific Northwest National Laboratory is developing a technique to analyze Phasor Measurement Unit data to identify typical patterns, atypical events, and precursors to a blackout or other undesirable event. The approach combines a data-driven multivariate analysis with an engineering-model approach. The method identifies atypical events, provides a plain English description of each event, and offers drill-down graphics for detailed investigations. The tool can be applied to the entire grid, individual organizations (e.g., TVA, BPA), or specific substations (e.g., TVA_CUMB). The tool is envisioned for (1) event investigations, (2) overnight processing to generate a Morning Report that characterizes the previous day's activity with respect to activity over the previous 10-30 days, and (3) potentially near-real-time operation to support the grid operators. This paper presents the current status of the tool and illustrations of its application to real-world PMU data collected in three 10-day periods in 2007.
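One common data-driven screen for "atypical" multivariate observations of this kind is the Mahalanobis distance against a baseline window; the sketch below is an illustrative stand-in with synthetic numbers, not PNNL's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

# Baseline window: 1000 "typical" 4-dimensional measurements
# (stand-ins for phasor magnitudes/angles from several units).
baseline = rng.multivariate_normal(mean=[1.0, 0.0, 1.0, 0.0],
                                   cov=0.01 * np.eye(4), size=1000)
mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def mahalanobis2(x):
    """Squared Mahalanobis distance of x from the baseline window."""
    d = x - mu
    return float(d @ cov_inv @ d)

typical = np.array([1.0, 0.0, 1.0, 0.0])
atypical = np.array([1.5, 0.3, 0.4, -0.2])   # a simulated disturbance

# Under normality, squared distances follow chi-square with 4 dof;
# a threshold near 18.5 gives roughly a 0.1% false-alarm rate.
print(mahalanobis2(typical), mahalanobis2(atypical))
```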
Distributed Multivariate Regression Using Wavelet-based Collective Data Mining.
Kargupta, Hillol
Distributed Multivariate Regression Using Wavelet-based Collective Data Mining. Daryl E. ... presents a method for distributed multivariate regression using wavelet-based Collective Data Mining (CDM), an approach to the analysis of distributed, heterogeneous databases with distinct feature spaces
Tatiana G. Levitskaia; James M. Peterson; Emily L. Campbell; Amanda J. Casella; Dean R. Peterman; Samuel A. Bryan
2013-12-01T23:59:59.000Z
In liquid–liquid extraction separation processes, accumulation of organic solvent degradation products is detrimental to the process robustness, and frequent solvent analysis is warranted. Our research explores the feasibility of online monitoring of the organic solvents relevant to used nuclear fuel reprocessing. This paper describes the first phase of developing a system for monitoring the tributyl phosphate (TBP)/n-dodecane solvent commonly used to separate used nuclear fuel. In this investigation, the effect of extraction of nitric acid from aqueous solutions of variable concentrations on the quantification of TBP and its major degradation product dibutylphosphoric acid (HDBP) was assessed. Fourier transform infrared (FTIR) spectroscopy was used to discriminate between HDBP and TBP in the nitric acid-containing TBP/n-dodecane solvent. Multivariate analysis of the spectral data facilitated the development of regression models for HDBP and TBP quantification in real time, enabling online implementation of the monitoring system. The predictive regression models were validated using TBP/n-dodecane solvent samples subjected to high-dose external gamma-irradiation. The predictive models were translated to flow conditions using a hollow fiber FTIR probe installed in a centrifugal contactor extraction apparatus, demonstrating the applicability of the FTIR technique coupled with multivariate analysis for the online monitoring of the organic solvent degradation products.
Levitskaia, Tatiana G.; Peterson, James M.; Campbell, Emily L.; Casella, Amanda J.; Peterman, Dean; Bryan, Samuel A.
2013-11-05T23:59:59.000Z
In liquid-liquid extraction separation processes, accumulation of organic solvent degradation products is detrimental to the process robustness, and frequent solvent analysis is warranted. Our research explores the feasibility of online monitoring of the organic solvents relevant to used nuclear fuel reprocessing. This paper describes the first phase of developing a system for monitoring the tributyl phosphate (TBP)/n-dodecane solvent commonly used to separate used nuclear fuel. In this investigation, the effect of extraction of nitric acid from aqueous solutions of variable concentrations on the quantification of TBP and its major degradation product dibutyl phosphoric acid (HDBP) was assessed. Fourier transform infrared (FTIR) spectroscopy was used to discriminate between HDBP and TBP in the nitric acid-containing TBP/n-dodecane solvent. Multivariate analysis of the spectral data facilitated the development of regression models for HDBP and TBP quantification in real time, enabling online implementation of the monitoring system. The predictive regression models were validated using TBP/n-dodecane solvent samples subjected to high-dose external gamma irradiation. The predictive models were translated to flow conditions using a hollow fiber FTIR probe installed in a centrifugal contactor extraction apparatus, demonstrating the applicability of the FTIR technique coupled with multivariate analysis for the online monitoring of the organic solvent degradation products.
Statistical analysis of cascading failures in power grids
Chertkov, Michael [Los Alamos National Laboratory; Pfitzner, Rene [Los Alamos National Laboratory; Turitsyn, Konstantin [Los Alamos National Laboratory
2010-12-01T23:59:59.000Z
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators, and lines. Our model is quasi-static in the causal, discrete-time, and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39, and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators, and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
Gerencher, J.J. Jr.
1983-01-01T23:59:59.000Z
Multivariate statistical techniques have been applied to study interrelationships among 12 variables within a set of 277 coals representing whole-seam channel, column, and core samples obtained from each of the 6 coal provinces of the United States, and varying in rank from lignite through anthracite. The data are maintained in a computerized data base at The Pennsylvania State University Coal Research Section. The variables selected are components of the elemental analysis (carbon, oxygen, organic sulfur, hydrogen, and nitrogen), selected components of the proximate analysis (volatile matter and moisture), calorific value, reflectance of vitrinite, and the relative proportions of the 3 maceral groups (total vitrinite, inertinite, and liptinite group macerals). Factor analyses performed on the entire data set and on subsets separated on the basis of rank, geographic location, and by cluster analysis indicated that rank is the most important factor in determining the amount of variation of each data set. The rank-dependent variables for the entire data set are carbon, reflectance, oxygen, volatile matter, calorific value, and moisture. The maceral groups account for the next greatest source of variation. Organic sulfur is independent of the first 2 factors and is the third most important source of variation. Cluster analyses indicated that the most significant partitioning produces 4 groups, which are differentiated primarily on the basis of rank, maceral composition, and organic sulfur content. Factor analyses of the individual groups provide insights into the coalification processes of these more homogeneous coal associations.
Statistical analysis of test data for APM rod issue
Edwards, T.B.; Harris, S.P.; Reeve, C.P.
1992-05-01T23:59:59.000Z
The uncertainty associated with the use of the K-Reactor axial power monitors (APMs) to measure roof-top-ratios is investigated in this report. Internal heating test data acquired under both DC-flow conditions and AC-flow conditions have been analyzed. These tests were conducted to simulate gamma heating at the lower power levels planned for reactor operation. The objective of this statistical analysis is to investigate the relationship between the observed and true roof-top-ratio (RTR) values and associated uncertainties at power levels within this lower operational range. Conditional on a given, known power level, a prediction interval for the true RTR value corresponding to a new, observed RTR is given. This is done for a range of power levels. Estimates of total system uncertainty are also determined by combining the analog-to-digital converter uncertainty with the results from the test data.
Statistical Analysis Of Tank 5 Floor Sample Results
Shine, E. P.
2012-08-01T23:59:59.000Z
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. 
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed in Appendix A, and the results of this analysis are reported in Appendix B. The data were generally found to follow a normal distribution, and to be homogeneous across composite samples.
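For normally distributed data, the UCL95 on a mean referred to above is the one-sided Student-t bound; a short sketch with made-up concentrations (not Tank 5 data):

```python
import numpy as np
from scipy import stats

# Hypothetical analyte concentrations: three composite samples,
# each measured three times (units arbitrary; invented numbers).
measurements = np.array([10.2, 9.8, 10.5, 9.9, 10.1, 10.4, 10.0, 9.7, 10.3])

n = len(measurements)
mean = measurements.mean()
sd = measurements.std(ddof=1)

# One-sided 95% upper confidence limit on the mean for normal data:
# UCL95 = mean + t_{0.95, n-1} * s / sqrt(n)
t95 = stats.t.ppf(0.95, df=n - 1)
ucl95 = mean + t95 * sd / np.sqrt(n)
print("mean =", mean, " UCL95 =", ucl95)
```

For non-normal or heavily censored data, EPA guidance recommends alternative UCL procedures (e.g., bootstrap or Chebyshev bounds) rather than the t-based limit.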
STATISTICAL ANALYSIS OF TANK 5 FLOOR SAMPLE RESULTS
Shine, E.
2012-03-14T23:59:59.000Z
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, radionuclide, inorganic, and anion concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. 
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed in Appendix A, and the results of this analysis are reported in Appendix B. The data were generally found to follow a normal distribution, and to be homogeneous across composite samples.
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31T23:59:59.000Z
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs.
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed, and the results of this analysis are reported. The data were generally found to follow a normal distribution, and to be homogeneous across composite samples.
STATISTICAL AND 3D NONLINEAR FINITE ELEMENT ANALYSIS OF SCHLEGEIS DAM
Balaji, Rajagopalan
STATISTICAL AND 3D NONLINEAR FINITE ELEMENT ANALYSIS OF SCHLEGEIS DAM. VICTOR SAOUMA, ERIC HANSEN. The study is composed of two parts. First, a statistical analysis of the dam crest displacement is performed, along with a prediction for the years 2000-2001. Then a 3D finite element analysis of Schlegeis dam is performed using
Montana, University of
of statistical costs associated with alternative survey scenarios. We used the program TRENDS (Gerrodette 1987) with the definition of r; for variable 8, we assumed that the t statistic was most appropriate, given that we ... An analysis of the tradeoffs in statistical power among annual, biennial, and triennial landbird
Statistical static timing analysis considering process variations and crosstalk
Veluswami, Senthilkumar
2005-11-01T23:59:59.000Z
Contents (excerpt): D. Testable Paths. III. Solution Methodology: A. Delay Model; B. Process Variations; C. Crosstalk; D. Statistical Timing...
Aires, Filipe
2003-01-01T23:59:59.000Z
In contrast to classical linear feedback analysis, we present a nonlinear approach for the determination of statistical estimates of all the pair-wise relationships among the system state variables, based on a neural network. Such an approach is valid in a theoretical model where the instantaneous sensitivities can be evaluated directly.
Fienberg, Stephen E.
Statistical Analysis of Multiple Sociometric Relations. Stephen E. Fienberg, Michael M. Meyer, and Stanley S. Wasserman. Journal of the American Statistical Association, Vol. 80, No. 389 (Mar. 1985). Loglinear models are adapted for the analysis
A model of the statistical power of comparative genome sequence analysis
Eddy, Sean
A model of the statistical power of comparative genome sequence analysis. Sean R. Eddy, Howard Hughes Medical Institute. Comparative genome sequence analysis is powerful, but sequencing genomes is expensive. It is desirable to be able to predict how many genomes are needed to achieve a particular statistical power in comparative analyses
Parallel and Statistical Analysis and Modeling of Nanometer VLSI Systems
Liu, Xue-Xin
2013-01-01T23:59:59.000Z
for reduced order analysis of linear circuits with multiple ...; worst case analysis of linear analog circuit performance; linear analog circuits under parameter variations by robust interval analysis.
Monolithic or hierarchical star formation? A new statistical analysis
Marios Kampakoglou; Roberto Trotta; Joe Silk
2007-11-23T23:59:59.000Z
We consider an analytic model of cosmic star formation which incorporates supernova feedback, gas accretion and enriched outflows, reproducing the history of cosmic star formation, metallicity, supernovae type II rates and the fraction of baryons allocated to structures. We present a new statistical treatment of the available observational data on the star formation rate and metallicity that accounts for the presence of possible systematics. We then employ a Bayesian Markov Chain Monte Carlo method to compare the predictions of our model with observations and derive constraints on the 7 free parameters of the model. We find that the dust correction scheme one chooses to adopt for the star formation data is critical in determining which scenario is favoured between a hierarchical star formation model, where star formation is prolonged by accretion, infall and merging, and a monolithic scenario, where star formation is rapid and efficient. We distinguish between these modes by defining a characteristic minimum mass, M > 10^{11} M_solar, in our fiducial model, for early type galaxies where star formation occurs efficiently. Our results indicate that the hierarchical star formation model can achieve better agreement with the data, but that this requires a high efficiency of supernova-driven outflows. In a monolithic model, our analysis points to the need for a mechanism that drives metal-poor winds, perhaps in the form of supermassive black hole-induced outflows. Furthermore, the relative absence of star formation beyond z ~ 5 in the monolithic scenario requires an alternative mechanism to dwarf galaxies for reionizing the universe at z ~ 11, as required by observations of the microwave background. While the monolithic scenario is less favoured in terms of its quality-of-fit, it cannot yet be excluded.
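A Bayesian MCMC comparison of model and data, as used above, reduces at its core to sampling a posterior; a minimal Metropolis sketch in Python with a stand-in one-parameter log-posterior (the actual model has 7 free parameters and a likelihood built from star-formation data):

```python
import numpy as np

rng = np.random.default_rng(5)

def log_post(theta):
    # Stand-in log-posterior: a standard-normal "constraint" on a
    # single model parameter, used here only to exercise the sampler.
    return -0.5 * theta ** 2

# Random-walk Metropolis sampler.
n_steps, step = 20000, 1.0
theta = 0.0
lp = log_post(theta)
chain = np.empty(n_steps)
for i in range(n_steps):
    prop = theta + step * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

# The chain's histogram approximates the posterior; its mean and
# spread summarize the parameter constraint.
print(chain.mean(), chain.std())
```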
Application of statistical learning theory to plankton image analysis
Hu, Qiao, Ph. D. Massachusetts Institute of Technology
2006-01-01T23:59:59.000Z
A fundamental problem in limnology and oceanography is the inability to quickly identify and map distributions of plankton. This thesis addresses the problem by applying statistical machine learning to video images collected ...
Statistical analysis and transfer of coarse-grain pictorial style
Bae, Soonmin
2005-01-01T23:59:59.000Z
We show that image statistics can be used to analyze and transfer simple notions of pictorial style of paintings and photographs. We characterize the frequency content of pictorial styles, such as multi-scale, spatial ...
Kockelman, Kara M.
William J. Murray Jr. Fellow, Department of Civil, Architectural and Environmental Engineering. In Accident Analysis & Prevention. ABSTRACT: This work examines the relationship between 3-year pedestrian crash counts; the split between severe (fatal or incapacitating) and non-severe crash rates reflects latent covariates that have impacts across both categories.
Statistical analysis of large-scale structure in the Universe
Martin Kerscher
1999-12-15T23:59:59.000Z
Methods for the statistical characterization of the large-scale structure in the Universe will be the main topic of the present text. The focus is on geometrical methods, mainly Minkowski functionals and the J-function. Their relations to standard methods used in cosmology and spatial statistics and their application to cosmological datasets will be discussed. This work is not only meant as a short review for cosmologists, but also attempts to illustrate these morphological methods and to make them accessible to scientists from other fields. Consequently, a short introduction to the standard picture of cosmology is given.
West, Mike
Statistics & Gene Expression Data Analysis, Note 8: Binary Regression Outcomes and Classification. Topics: classification, validation, prognosis; binary regression models; linear regression models. Standard statistical models transform from real values to (0, 1) using a specified non-linear function.
UNDERSTANDING MANUFACTURING ENERGY USE THROUGH STATISTICAL ANALYSIS KELLY KISSOCK AND JOHN SERYAK
Kissock, Kelly
..., OHIO. ABSTRACT: Energy in manufacturing facilities is used for direct production of goods, space conditioning, and other end uses. The paper presents a methodology for statistically analyzing plant energy use in terms of these major end uses. The methodology uses as few as 60 ...
Smyth, Gordon K.
Statistical Issues in cDNA Microarray Data Analysis. Gordon K. Smyth, Yee Hwa Yang, and Terry Speed. From: Methods in Molecular Biology, vol. 224: Functional Genomics, Humana Press, Totowa, NJ. 1. Introduction: Statistical considerations are frequently to the fore in the analysis
Fazzio, Thomas J. (Thomas Joseph)
2010-01-01T23:59:59.000Z
This paper attempts to understand the price dynamics of the North American natural gas market through a statistical survey that includes an analysis of the variables influencing the price and volatility of this energy ...
ANALYSIS OF STATISTICS FOR GENERALIZED STIRLING PERMUTATIONS MARKUS KUBA AND ALOIS PANHOLZER
Panholzer, Alois
ABSTRACT: In this work we study generalizations of Stirling permutations, a restricted class of permutations, and the correspondence between such generalized Stirling permutations and various families of increasing trees extending
University of Illinois at Chicago; Montana State University; Bhardwaj, Chhavi; Cui, Yang; Hofstetter, Theresa; Liu, Suet Yi; Bernstein, Hans C.; Carlson, Ross P.; Ahmed, Musahid; Hanley, Luke
2013-04-01T23:59:59.000Z
7.87 to 10.5 eV vacuum ultraviolet (VUV) photon energies were used in laser desorption postionization mass spectrometry (LDPI-MS) to analyze biofilms comprised of binary cultures of interacting microorganisms. The effect of photon energy was examined using both tunable synchrotron and laser sources of VUV radiation. Principal components analysis (PCA) was applied to the MS data to differentiate species in Escherichia coli-Saccharomyces cerevisiae coculture biofilms. PCA of LDPI-MS also differentiated individual E. coli strains in a biofilm comprised of two interacting gene deletion strains, even though these strains differed from the wild type K-12 strain by no more than four gene deletions each out of approximately 2000 genes. PCA treatment of 7.87 eV LDPI-MS data separated the E. coli strains into three distinct groups: two "pure" groups and a mixed region. Furthermore, the "pure" regions of the E. coli cocultures showed greater variance by PCA when analyzed by 7.87 eV photon energies than by 10.5 eV radiation. Comparison of the 7.87 and 10.5 eV data is consistent with the expectation that the lower photon energy selects a subset of low ionization energy analytes while 10.5 eV is more inclusive, detecting a wider range of analytes. These two VUV photon energies therefore give different spreads via PCA, and their respective use in LDPI-MS constitutes an additional experimental parameter to differentiate strains and species.
Parallel and Statistical Analysis and Modeling of Nanometer VLSI Systems
Liu, Xue-Xin
2013-01-01T23:59:59.000Z
Traditional fan-based cooling techniques are not sufficient for these cooling problems; advanced cooling techniques such as integrated ... are needed. Fast and accurate thermal analysis techniques
SACI: Statistical Static Timing Analysis of Coupled Interconnects
Pedram, Massoud
Static timing analysis (STA) must capture variations in the circuit timing that stem from various sources of variation, as well as crosstalk effects in these circuits. As a result, crosstalk analysis and management have been classified as ... The approach models the voltage on each line as a linear function of random variables and then uses these r.v.'s to compute the circuit timing moments.
Statistical analysis of sampling methods in quantum tomography
Thomas Kiesel
2012-06-07T23:59:59.000Z
In quantum physics, all measured observables are subject to statistical uncertainties, which arise from the quantum nature as well as the experimental technique. We consider the statistical uncertainty of the so-called sampling method, in which one estimates the expectation value of a given observable by empirical means of suitable pattern functions. We show that if the observable can be written as a function of a single directly measurable operator, the variance of the estimate from the sampling method equals the quantum mechanical one. In this sense, we say that the estimate is on the quantum mechanical level of uncertainty. In contrast, if the observable depends on non-commuting operators, e.g. different quadratures, the quantum mechanical level of uncertainty is not achieved. The impact of the results on quantum tomography is discussed, and different approaches to quantum tomographic measurements are compared. It is shown explicitly, for the estimation of quasiprobabilities of a quantum state, that balanced homodyne tomography does not operate on the quantum mechanical level of uncertainty, while unbalanced homodyne detection does.
Indoor air quality: multivariate analyses of the relationship between indoor and outdoor aerosols
McCarthy, S.M.
1986-01-01T23:59:59.000Z
A unique multivariate data set incorporating simultaneous indoor and outdoor measurements of sixteen air contaminants at ten homes has been used to investigate the contribution of outdoor concentrations to indoor aerosol variability, and to characterize indoor source contributions to the indoor concentrations. The data were available from an earlier field study of particle and gas concentrations outside and inside five homes in each of two cities: Portage, Wisconsin, and Steubenville, Ohio. Three distinct multivariate statistical techniques were used sequentially in the research, successively building on the results and interpretations as they developed. Cluster analysis was selected as the initial method for partitioning the variables into subgroups comprised of highly intercorrelated variables. Significant site-to-site variability was evident in both cities; however, within sites, indoor clusters had similarities to the outdoor clusters. Principal component analysis was next performed on the Portage data, reduced in dimension to avoid problems of singularity in the data matrix. The principal component analysis results were used to attribute predominant indoor and outdoor sources, including cigarette smoke, wood stove, road dust, and urban combustion sources. Finally, multiple regression analysis was performed to relate outdoor pollutant concentrations to a composite index of the indoor aerosol as represented by the orthogonal rotations of the indoor principal components. The research indicates that this multivariate analysis framework is preferable to univariate analysis in evaluating the influence of outdoor aerosols and indoor sources on indoor air quality data.
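The sequential strategy described above (partition the variables, extract principal components from the indoor data, then regress component scores on outdoor concentrations) can be sketched with synthetic data. The variable counts, noise levels, and correlation threshold below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 100 sampling days, 5 outdoor and 5 indoor pollutant series.
n = 100
outdoor = rng.normal(size=(n, 5))
# Indoor concentrations partly driven by outdoor levels plus indoor-source noise.
indoor = 0.6 * outdoor + rng.normal(scale=0.5, size=(n, 5))

# Step 1 (stand-in for cluster analysis): correlation matrix used to group
# highly intercorrelated indoor variables (threshold 0.5 is arbitrary here).
corr = np.corrcoef(indoor, rowvar=False)

# Step 2: PCA of the standardized indoor data via eigen-decomposition.
z = (indoor - indoor.mean(0)) / indoor.std(0)
evals, evecs = np.linalg.eigh(np.cov(z, rowvar=False))
order = np.argsort(evals)[::-1]
scores = z @ evecs[:, order]              # principal-component scores
explained = evals[order] / evals.sum()    # fraction of variance per component

# Step 3: regress the first indoor PC on the outdoor variables (least squares).
X = np.column_stack([np.ones(n), outdoor])
beta, *_ = np.linalg.lstsq(X, scores[:, 0], rcond=None)
pred = X @ beta
ss_res = np.sum((scores[:, 0] - pred) ** 2)
ss_tot = np.sum((scores[:, 0] - scores[:, 0].mean()) ** 2)
r2 = 1 - ss_res / ss_tot                  # share of indoor PC explained by outdoors
```

The R² of the final regression plays the role of the study's "contribution of outdoor concentrations to indoor aerosol variability".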
Department of Statistics STATISTICS COLLOQUIUM
ERIC KOLACZYK, Department of Statistics, Boston University. Statistical Analysis of Network Data: (Re)visiting the Foundations. MONDAY, October 13, 2014, at 4. ... statistical methods and modeling have been central to these efforts. But how well do we truly understand
Identification of faults in a multivariate process with Bayesian network
Boyer, Edmond
control charts in a Bayesian network. Thus, in the context of multivariate processes, we propose a Bayesian network approach. Key words: Multivariate SPC, T2 decomposition, Bayesian network. 1. Introduction: Nowadays, many tools exist (i.e. control charts, methods based on Principal Component Analysis, Projection to Latent Structures)
Lawrence, Rick L.
Statistics programs teach individuals to apply mathematical principles to the collection and analysis of data, and to apply mathematical and statistical knowledge to the design of surveys and experiments; the collection, processing, and analysis of data; and the interpretation of the results. Statisticians may apply their knowledge of statistical methods to a variety
ECOGRAPHY 25: 553–557, 2002. Integrating the statistical analysis of spatial data in ecology
Liebhold, Andrew
Integrating the statistical analysis of spatial data in ecology. Ecography 25: 553–557. In many areas of ecology there is an increasing emphasis on spatial data; the analysis of spatial data has yielded considerable insight into various ecological problems, and this diversity
A Model of the Statistical Power of Comparative Genome Sequence Analysis
Eddy, Sean
A Model of the Statistical Power of Comparative Genome Sequence Analysis. Sean R. Eddy, Howard Hughes Medical Institute, Missouri, United States of America. Comparative genome sequence analysis is powerful, but sequencing genomes is expensive. It is desirable to be able to predict how many genomes are needed for comparative genomics
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01T23:59:59.000Z
A Principal Components Analysis (PCA) has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
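As a rough illustration of the PCA-plus-outlier-map idea (a sketch, not the DEC-10 program described above), multichannel radiometric samples can be reduced to principal-component scores and anomalous samples flagged. The channel count and anomaly magnitude are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical aerial radiometric survey: 500 samples, 6 spectral channels.
data = rng.normal(size=(500, 6))
data[:3] += 8.0   # three anomalous samples, elevated in every channel

# PCA via SVD of the mean-centered data matrix.
centered = data - data.mean(0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt.T   # linear combinations of the original variates

# Outlier-map analogue: flag samples whose first-PC score lies more than
# 3 standard deviations from the mean first-PC score.
pc1 = scores[:, 0]
outliers = np.abs(pc1 - pc1.mean()) > 3 * pc1.std()
```

Histograms of each column of `scores` would correspond to the per-variate histograms the program generates.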
Statistical language analysis for automatic exfiltration event detection.
Robinson, David Gerald
2010-04-01T23:59:59.000Z
This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.
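LDA itself can be demonstrated on a toy "log corpus" with a minimal collapsed Gibbs sampler. The event vocabulary, topic count, and Dirichlet priors below are illustrative assumptions; a real detector would operate on tokenized network logs:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical corpus: each "document" is one session's network-event types.
docs = [[0, 0, 1, 1, 2], [0, 1, 1, 2, 2], [3, 3, 4, 4, 5],
        [3, 4, 4, 5, 5], [0, 1, 2, 0, 1], [3, 3, 5, 4, 3]]
V, K, alpha, beta = 6, 2, 0.5, 0.5   # vocab size, topics, Dirichlet priors

# Collapsed Gibbs sampling: resample each token's topic given all others.
z = [[int(rng.integers(K)) for _ in d] for d in docs]
ndk = np.zeros((len(docs), K))   # doc-topic counts
nkw = np.zeros((K, V))           # topic-word counts
nk = np.zeros(K)                 # topic totals
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        t = z[d][i]; ndk[d, t] += 1; nkw[t, w] += 1; nk[t] += 1

for _ in range(200):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            t = z[d][i]
            ndk[d, t] -= 1; nkw[t, w] -= 1; nk[t] -= 1
            p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            t = int(rng.choice(K, p=p / p.sum()))
            z[d][i] = t
            ndk[d, t] += 1; nkw[t, w] += 1; nk[t] += 1

theta = (ndk + alpha) / (ndk + alpha).sum(1, keepdims=True)  # session topic mix
```

In an exfiltration-detection setting, a session whose inferred topic mixture `theta[d]` sits far from the corpus-wide average mixture would be the candidate anomaly.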
U.S. Energy Information Administration Independent Statistics & Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
DOE/OR-1066R5/02-03 7-1 7. DATA ANALYSIS AND STATISTICAL TREATMENT
Pennycook, Steve
7.1 INTRODUCTION: Four goals for analysis and statistical treatment of data are identified in Chap. 7 of the Regulatory Guide. These goals are addressed for the radiological and nonradiological monitoring programs through a general discussion of the DQO process, data analysis
1 Introduction Towards Better Graphics for Multivariate Analysis
Thioulouse, Jean
1 Introduction. Summary. Keywords: Towards Better Graphics for Multivariate Analysis, interactive graphics, factor map, principal component analysis. Dynamic graphics are gaining more and more popularity with the availability of powerful microcomputers and user-friendly graphical interfaces
Swan II, J. Edward
Guided Analysis of Hurricane Trends Using Statistical Processes Integrated with Interactive ... The system's utility is demonstrated with an extensive hurricane climate study that was conducted by a hurricane expert. In the study, the expert used a new data set of environmental weather data, composed of 28
Predicting landfalling hurricane numbers from basin hurricane numbers: basic statistical analysis
Laepple, T; Penzer, J; Bellone, E; Nzerem, K; Laepple, Thomas; Jewson, Stephen; Penzer, Jeremy; Bellone, Enrica; Nzerem, Kechi
2007-01-01T23:59:59.000Z
One possible method for predicting landfalling hurricane numbers is to first predict the number of hurricanes in the basin and then convert that prediction to a prediction of landfalling hurricane numbers using an estimated proportion. Should this work better than just predicting landfalling hurricane numbers directly? We perform a basic statistical analysis of this question in the context of a simple abstract model.
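The question posed above can be explored numerically under a simple assumed model: Poisson basin counts, binomial landfall thinning, and a basin record longer than the landfall record. All parameter values below are arbitrary choices for the sketch, not estimates from hurricane data:

```python
import numpy as np

rng = np.random.default_rng(3)

lam, p = 10.0, 0.25          # assumed basin rate and landfall probability
long_years, short_years, trials = 100, 20, 2000
err_direct, err_indirect = [], []

for _ in range(trials):
    basin_long = rng.poisson(lam, size=long_years)   # long basin record
    basin_short = basin_long[:short_years]           # landfall data only here
    land = rng.binomial(basin_short, p)
    true_rate = lam * p
    # Direct: mean of the short landfall record.
    err_direct.append((land.mean() - true_rate) ** 2)
    # Indirect: long-record basin mean scaled by the estimated landfall fraction.
    frac = land.sum() / basin_short.sum()
    err_indirect.append((basin_long.mean() * frac - true_rate) ** 2)

mse_direct = float(np.mean(err_direct))
mse_indirect = float(np.mean(err_indirect))
```

Comparing `mse_direct` with `mse_indirect` mirrors the paper's question; which route wins depends on the record lengths and on how stable the landfall fraction is.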
PATHS: Analysis of PATH Duration Statistics and their Impact on Reactive MANET Routing Protocols
Krishnamachari, Bhaskar
Department of Electrical Engineering, University of Southern California. {narayans,fbai,bkrishna,helmy}@usc.edu. ABSTRACT: We develop a detailed approach to study how mobility impacts the performance of reactive MANET
Paris-Sud XI, UniversitÃ© de
... with statistical analysis. Anthony Ung, Laure Malherbe, Frederik Meleux, Bertrand Bessagnet, Laurence Rouil and the MACC-II modeling team, INERIS institute, Paris, France. Corresponding author: Anthony.ung@ineris.fr. Abstract: QA/QC dossiers are available on the MACC project website for each model. All models have also very significant
Statistical analysis of electric power production costs JORGE VALENZUELA and MAINAK MAZUMDAR*
Mazumdar, Mainak
whether the utility's own generators should be used to produce power or whether power should be purchased from outside. There must be sufficient production at all times to meet the demand for electric power. If a low-cost generating unit fails
Relative Apparent Synapomorphy Analysis (RASA) I: The Statistical Measurement of Phylogenetic Signal
USDA Forest Service, Reno, Nevada. We have developed a new approach to the measurement of phylogenetic signal in phylogenetic inference, providing measurable sensitivity and power. The performance of RASA is examined under various
Threshold phenomena and complexity: a statistical physics analysis of the random
Duxbury, Phillip M.
Threshold phenomena and complexity: a statistical physics analysis of the random Satisfiability problem. Rémi Monasson, Laboratoire de Physique Théorique de l'ENS, 75005 Paris. Abstract: ... methods designed by physicists to deal with optimization or decision problems, in an accessible language
A Prediction Method for Job Runtimes on Shared Processors: Survey, Statistical Analysis and New
van der Mei, Rob
The method provides predictions of the expected computation times of jobs on remote hosts. Currently, there are no effective prediction methods available that cope with the ever-changing running times of jobs in a grid environment
Statistical analysis of wind energy in Chile David Watts a,b,*, Danilo Jara a
Catholic University of Chile (Universidad CatÃ³lica de Chile)
Data Bank: Statistical analysis of wind energy in Chile. David Watts, Danilo Jara. December 2010. Keywords: wind; wind speed; energy; capacity factor; electricity; Chile. Abstract: ... has been remarkably influenced by new requirements; the search for new energy supply sources has
Statistical analysis of 4-year observations of aerosol sizes in a semi-rural continental environment
Lee, Shan-Hu
1. Introduction: Formation of new aerosol particles via gas-to-particle conversion is an important process, and is key to understanding how new particle formation (NPF) processes lead to the formation of cloud condensation nuclei (CCN)
Statistical Methods for Enhanced Metrology in Semiconductor/Photovoltaic Manufacturing
Zeng, Dekong
2012-01-01T23:59:59.000Z
statistical process control (SPC) charts. Multivariate SPC charts utilize high-dimensional data, and the limits for a multivariate SPC chart can be defined with
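A standard way to define such multivariate limits is Hotelling's T² statistic with a chi-square-based control limit. The sketch below uses simulated in-control data and an invented two-variable mean shift as the "fault"; none of the numbers come from the thesis above:

```python
import numpy as np

rng = np.random.default_rng(4)

# Phase I: estimate the in-control mean and covariance from reference data.
ref = rng.normal(size=(200, 4))        # 200 in-control samples, 4 variables
mu = ref.mean(axis=0)
Sinv = np.linalg.inv(np.cov(ref, rowvar=False))

def t2(x):
    """Hotelling T^2 distance of one observation from the in-control center."""
    d = x - mu
    return float(d @ Sinv @ d)

# Phase II: compare new observations against a chi-square-based control limit.
limit = 13.28                          # ~99th percentile of chi-square, 4 dof
in_control = rng.normal(size=4)
faulty = rng.normal(size=4) + np.array([6.0, -6.0, 0.0, 0.0])  # mean shift
```

An observation whose T² exceeds `limit` would trigger the out-of-control signal on the chart.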
Multivariate extensions of the Golden-Thompson inequality
Frank Hansen
2014-07-02T23:59:59.000Z
We study concave trace functions of several operator variables and formulate and prove multivariate generalisations of the Golden-Thompson inequality. The obtained results imply that certain functionals in quantum statistical mechanics have bounds of the same form as they appear in classical physics.
Stable multivariate Eulerian polynomials and generalized Stirling permutations
Haglund, Jim
J. Haglund, Mirk... Abstract: We study Eulerian polynomials as the generating polynomials of the descent statistic over Stirling permutations; this extends the Eulerian polynomial for permutations, and extends naturally to r-Stirling and generalized Stirling permutations
Statistical analysis of the electrical breakdown time delay distributions in krypton
Maluckov, Cedomir A.; Karamarkovic, Jugoslav P.; Radovic, Miodrag K.; Pejovic, Momcilo M. [Technical Faculty in Bor, University of Belgrade, Vojske Jugoslavije 24, 19210 Bor (Serbia and Montenegro); Faculty of Civil Engineering and Architecture, University of Nis, Beogradska 14, 18000 Nis (Serbia and Montenegro); Faculty of Sciences and Mathematics, University of Nis, P.O. Box 224, 18001 Nis (Serbia and Montenegro); Faculty of Electronic Engineering, University of Nis, P.O. Box 73, 18001 Nis (Serbia and Montenegro)
2006-08-15T23:59:59.000Z
The statistical analysis of the experimentally observed electrical breakdown time delay distributions in the krypton-filled diode tube at 2.6 mbar is presented. The experimental distributions are obtained on the basis of 1000 successive and independent measurements. The theoretical electrical breakdown time delay distribution is evaluated as the convolution of the statistical time delay, which follows an exponential distribution, and the discharge formative time, which follows a Gaussian distribution. The distribution parameters are estimated by stochastic modelling of the time delay distributions and by comparing them with the experimental distributions for different relaxation times, voltages, and intensities of UV radiation. The transition of distribution shapes, from Gaussian-type to exponential-like, is investigated by calculating the corresponding skewness and excess kurtosis parameters. It is shown that the mathematical model based on the convolution of two random variable distributions describes the experimentally obtained time delay distributions and the separation of the total breakdown time delay into the statistical and formative time delays.
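The convolution of an exponential statistical delay with a Gaussian formative time is the "ex-Gaussian" distribution, and the reported shape transition can be reproduced by simulation. The parameter values below are invented for illustration, not taken from the krypton measurements:

```python
import numpy as np

rng = np.random.default_rng(5)

def skew_kurt(x):
    """Sample skewness and excess kurtosis of a 1-D array."""
    c = x - x.mean()
    m2 = np.mean(c**2)
    return np.mean(c**3) / m2**1.5, np.mean(c**4) / m2**2 - 3.0

n = 200_000
formative = rng.normal(loc=100.0, scale=5.0, size=n)   # Gaussian formative time

# Gaussian-dominated regime: short statistical delay, near-symmetric total delay.
total_g = formative + rng.exponential(scale=1.0, size=n)
# Exponential-dominated regime: long statistical delay, strongly right-skewed.
total_e = formative + rng.exponential(scale=50.0, size=n)

s_g, k_g = skew_kurt(total_g)
s_e, k_e = skew_kurt(total_e)
```

As the exponential scale grows past the Gaussian width, the skewness climbs from near 0 toward the pure-exponential value of 2, which is the Gaussian-to-exponential transition the abstract describes.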
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01T23:59:59.000Z
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
Analysis of Large Scale Structure using Percolation, Genus and Shape Statistics
V. Sahni
1998-03-17T23:59:59.000Z
We probe gravitational clustering in N-body simulations using geometrical descriptors sensitive to `connectedness': the genus curve, percolation and shape statistics. We find that both genus and percolation curves provide complementary probes of large scale structure topology and could be used to discriminate between models of structure formation and the analysis of observational data such as galaxy catalogs and MBR maps. An analysis of `shapes' in N-body simulations has shown that filaments are more pronounced than pancakes. To probe shapes of clusters and superclusters more rigorously we propose a new shape statistic which does not fit isodensity surfaces by ellipsoids (as done earlier). Our shape statistic is derived from fundamental properties of a compact body: its Minkowski functionals. The new shape statistic gives sensible results for topologically simple surfaces such as the ellipsoid, and for more complicated surfaces such as the torus. (Invited talk, to appear in: Proceedings of the IAU Symposium No. 183, Kyoto, Japan Aug. 1997, ed. K. Sato, Kluwer Academic Publ.)
Statistical Process Variation Analysis of a Graphene FET based LC-VCO for WLAN Applications
Mohanty, Saraju P.
Md Abir Khan, Saraju Mohanty, and Elias Kougianos (AbirKhan@my.unt.edu, saraju.mohanty@unt.edu, elias.kougianos@unt.edu). Abstract: Graphene, which is a single-atom layer of carbon, ... high-frequency electronics due to low Ion/Ioff ratio. In this paper, design exploration of a graphene FET (GFET) based LC-VCO
Development of a thermobalance and analysis of lignites by thermogravimetric and statistical methods
Ferguson, James Allen
1984-01-01T23:59:59.000Z
DEVELOPMENT OF A THERMOBALANCE AND ANALYSIS OF LIGNITES BY THERMOGRAVIMETRIC AND STATISTICAL METHODS. A Thesis by JAMES ALLEN FERGUSON, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements... An important property of a coal or lignite is its calorific value, Q, the heat it produces when combusted. A simple, inexpensive, and rapid method for accurately predicting Q is needed because the presently available methods are complicated, expensive, and take a great...
Statistical Analysis of Microgravity Two-Phase Slug Flow via the Drift Flux Model
Larsen, Benjamin A
2014-05-01T23:59:59.000Z
STATISTICAL ANALYSIS OF MICROGRAVITY TWO-PHASE SLUG FLOW VIA THE DRIFT FLUX MODEL. A Thesis by BENJAMIN ANDREW LARSEN, submitted to the Office of Graduate and Professional Studies of Texas A&M University in partial fulfillment... ...made their data available to me and willingly took the time to converse about their work. Finally I would like to thank my parents Donald and Christine Larsen for their love and support in completing my graduate work.
A Statistical Analysis of Santa Barbara Ambulance Response in 2006: Performance Under Load
Chang, Joshua C; Schoenberg, Frederic P.
2009-01-01T23:59:59.000Z
HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Pettit, Frank K.; Bowie, James U. [DOE-Molecular Biology Institute
HotPatch finds unusual patches on the surface of proteins, and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temperature factors) are replaced by patch indices, with statistical scores given in the PDB file's header remarks, and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]
Advanced statistical methods for eye movement analysis and modeling: a gentle introduction
Boccignone, Giuseppe
2015-01-01T23:59:59.000Z
In this Chapter we show that by considering eye movements, and in particular the resulting sequence of gaze shifts, as a stochastic process, a wide variety of tools become available for analysis and modelling beyond conventional statistical methods. Such tools encompass random walk analyses and more complex techniques borrowed from the pattern recognition and machine learning fields. After a brief, though critical, probabilistic tour of current computational models of eye movements and visual attention, we lay down the basis for gaze shift pattern analysis. To this end, the concepts of Markov processes, the Wiener process and related random walks within the Gaussian framework of the Central Limit Theorem will be introduced. Then, we will deliberately violate fundamental assumptions of the Central Limit Theorem to elicit a larger perspective, rooted in statistical physics, for analysing and modelling eye movements in terms of anomalous, non-Gaussian random walks and modern foraging theory. Eventually, by resort...
Columbia University
Use of Statistical Entropy and Life Cycle Analysis to Evaluate Global Warming Potential of Waste. The statistical entropy (SE) function has been applied to waste treatment systems to account for dilution of municipal solid waste (MSW). A greenhouse gas-forcing factor is also introduced to account for the entropy
MULTIVARIATE CONTROL CHARTS WITH A BAYESIAN NETWORK. Sylvain VERRON, Teodor TIPLICA, Abdessamad KOBI
Paris-Sud XI, UniversitÃ© de
of a multivariate process with a Bayesian network. As discriminant analysis is easily modeled with a Bayesian network, control charts can be treated as particular cases of the discriminant analysis. So, we give the structure of the Bayesian network as well as the parameters of the network in order to detect faults in the multivariate space in the same manner as if we
Noise-Based Volume Rendering for the Visualization of Multivariate Volumetric Data
Tierny, Julien
Application areas from meteorology to geology require the analysis of such multivariate data and are in need of comprehensive visualization. ... velocity upstream (cyan); this is impossible with normal direct volume rendering (d). Abstract: Analysis
DECOMPOSITION OF MULTIVARIATE DATASETS WITH STRUCTURE/ORDERING
analysis. However, contrary to Fourier decomposition, these new variables are localized in frequency as well as in location (space, time, wavelength, etc.). 1 Introduction: The maximum autocorrelation factor (MAF) analysis
Towards Better Graphics for Multivariate Analysis
Thioulouse, Jean
Keywords: exploratory data analysis, interactive graphics, factor map, principal component analysis. Dynamic graphics ... user-friendly graphical interfaces: their help in analysing numerical data sets has been widely recognized, and the book...
Statistical and risk analysis for the measured and predicted axial response of 100 piles
Perdomo, Dario
1986-01-01T23:59:59.000Z
Figure 2: load-settlement curves, "good guess" example for an actual curve (from Briaud et al., 1985); top load in kips vs. settlement for Pile I.D. No. 28, with curves labelled Actual, Wild Guess, Penpile, Verbrugge (L.P.C. cone), Coyle, Briaud-Tucker, and L.P.C. The results of the statistical analysis are shown in Tables ... For driven piles in sand by Coyle's method: mean predicted/measured = 1.01...
Hofland, G.S.; Barton, C.C.
1990-10-01T23:59:59.000Z
The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig.
Applications of Minkowski Functionals to the Statistical Analysis of Dark Matter Models
Michael Platzoeder; Thomas Buchert
1995-09-04T23:59:59.000Z
A new method for the statistical analysis of 3D point processes, based on the family of Minkowski functionals, is explained and applied to modelled galaxy distributions generated by a toy model and by cosmological simulations of the large-scale structure in the Universe. These measures are sensitive to both geometrical and topological properties of spatial patterns and appear to be very effective in discriminating different point processes. Moreover, by means of conditional subsampling, different building blocks of large-scale structures such as sheets, filaments, and clusters can be detected and extracted from a given distribution.
In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units
Ranjan, Niloo [ORNL]; Sanyal, Jibonananda [ORNL]; New, Joshua Ryan [ORNL]
2013-08-01T23:59:59.000Z
Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is a part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data covering monthly, daily, hourly, and 15-minute time steps for each annual simulation are produced. A total of 270 TB+ of data has been produced. In this project, the simulation data is statistically analyzed in-situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
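The summary statistics named in this abstract (sum, mean, variance, standard deviation) reduce to a single-pass accumulation, which is what makes them amenable to in-situ reduction while simulations are still running. The sketch below shows the idea in plain NumPy; the function name and streaming interface are illustrative, not taken from the Autotune codebase, which uses CUDA with C/MPI.

```python
import numpy as np

def summarize(chunks):
    """Single-pass accumulation of sum, mean, variance, and standard
    deviation over streamed chunks, so the raw output never has to be
    held (or stored) in full."""
    n, s, ss = 0, 0.0, 0.0
    for chunk in chunks:
        a = np.asarray(chunk, dtype=np.float64)
        n += a.size
        s += a.sum()
        ss += np.square(a).sum()
    mean = s / n
    var = ss / n - mean ** 2  # population variance
    return {"sum": s, "mean": mean, "variance": var, "std": var ** 0.5}
```

For production use, a cancellation-safe update such as Welford's algorithm would be preferable to the raw sum-of-squares shown here.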
Bayesian Calibration of Expensive Multivariate
Oakley, Jeremy
Bayesian Calibration of Expensive Multivariate Computer Experiments. Richard D. Wilkinson, University of Sheffield. This chapter is concerned with how to calibrate a computer model to observational data. The approach to calibration described here was first given by Kennedy and O'Hagan (2001). Their approach...
Jagadheep D. Pandian; Paul F. Goldsmith
2007-08-23T23:59:59.000Z
We present an analysis of the properties of the 6.7 GHz methanol maser sample detected in the Arecibo Methanol Maser Galactic Plane Survey. The distribution of the masers in the Galaxy, and the statistics of their multi-wavelength counterparts, are consistent with the hypothesis of 6.7 GHz maser emission being associated with massive young stellar objects. Using the detection statistics of our survey, we estimate the minimum number of methanol masers in the Galaxy to be 1275. The l-v diagram of the sample shows the tangent point of the Carina-Sagittarius spiral arm to be around 49.6 degrees, and suggests the occurrence of massive star formation along the extension of the Crux-Scutum arm. A Gaussian component analysis of the maser spectra shows the mean line-width to be 0.38 km/s, which is more than a factor of two larger than what has been reported in the literature. We also find no evidence that faint methanol masers have different properties than their bright counterparts.
Interactive statistical-distribution-analysis program utilizing numerical and graphical methods
Glandon, S. R.; Fields, D. E.
1982-04-01T23:59:59.000Z
The TERPED/P program is designed to facilitate the quantitative analysis of experimental data, determine the distribution function that best describes the data, and provide graphical representations of the data. This code differs from its predecessors, TEDPED and TERPED, in that a printer-plotter has been added for graphical output flexibility. The addition of the printer-plotter provides TERPED/P with a method of generating graphs that is not dependent on DISSPLA, Integrated Software Systems Corporation's confidential proprietary graphics package. This makes it possible to use TERPED/P on systems not equipped with DISSPLA. In addition, the printer plot is usually produced more rapidly than a high-resolution plot can be generated. Graphical and numerical tests are performed on the data in accordance with the user's assumption of normality or lognormality. Statistical analysis options include computation of the chi-squared statistic and its significance level and the Kolmogorov-Smirnov one-sample test confidence level for data sets of more than 80 points. Plots can be produced on a Calcomp paper plotter, a FR80 film plotter, or a graphics terminal using the high-resolution, DISSPLA-dependent plotter or on a character-type output device by the printer-plotter. The plots are of cumulative probability (abscissa) versus user-defined units (ordinate). The program was developed on a Digital Equipment Corporation (DEC) PDP-10 and consists of 1500 statements. The language used is FORTRAN-10, DEC's extended version of FORTRAN-IV.
Statistical Analysis of the Inverse Problem
Introduction: linear and non-linear regression. Example of a linear model: y = β0 + β1 x + β2 x^2. A model with trigonometric forms is a non-linear regression model. Assumption: we have equal-variance measurement errors.
Symbolic Discriminant Analysis for Mining Gene Expression Patterns Jason Moore
Fernandez, Thomas
Jason H. Moore, Joel S. Parker, Lance W. Hahn (Vanderbilt). Application: leukemia. Linear discriminant analysis is a popular multivariate statistical approach for classification of observations into groups because the theory is well...
Stability and error analysis on partially implicit schemes
Sun, Tong
Department of Mathematics and Statistics, Bowling Green State University, Bowling Green, OH 43403. Abstract: Subdomain techniques have been...
The University of Chicago Department of Statistics
Stephens, Matthew
Department of Statistics, The University of Chicago. Market Efficiency of Crude Oil Futures: A Multivariate Approach. Thursday, May 12, 2011, at 4:00 PM, 110 Eckhart Hall, 5734 S. University Avenue. Abstract: Crude oil futures ... constitute a multivariate framework for describing the cointegrating relationship among...
Time varying, multivariate volume data reduction
Ahrens, James P [Los Alamos National Laboratory; Fout, Nathaniel [UC DAVIS; Ma, Kwan - Liu [UC DAVIS
2010-01-01T23:59:59.000Z
Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. In this paper we present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs.
temporal compression is data dependent, although temporal compression has the advantage of offering smooth animations, while spatial compression can handle volumes of larger dimensions.
Statistical Model Analysis of (n,p) Cross Sections and Average Energy For Fission Neutron Spectrum
Odsuren, M.; Khuukhenkhuu, G. [Nuclear Research Center, National University of Mongolia, Ulaanbaatar (Mongolia)
2011-06-28T23:59:59.000Z
Investigation of charged particle emission reaction cross sections for fast neutrons is important to both nuclear reactor technology and the understanding of nuclear reaction mechanisms. In particular, the study of (n,p) cross sections is necessary to estimate radiation damage due to hydrogen production, nuclear heating, and transmutations in the structural materials of fission and fusion reactors. On the other hand, it is often necessary in practice to evaluate the neutron cross sections of nuclides for which no experimental data are available. Because of this, we carried out a systematic analysis of known experimental (n,p) and (n,a) cross sections for fast neutrons and observed a systematic regularity over the wide energy interval of 6-20 MeV and for a broad mass range of target nuclei. To explain this effect, some formulae were deduced using the compound, pre-equilibrium, and direct reaction mechanisms. In this paper, known experimental (n,p) cross sections averaged over the thermal fission neutron spectrum of U-235 are analyzed in the framework of the statistical model. It was shown that the experimental data are satisfactorily described by the statistical model. Also, in the case of (n,p) cross sections, the effective average neutron energy for the fission spectrum of U-235 was found to be around 3 MeV.
Schrull, Jeffrey Lee
1987-01-01T23:59:59.000Z
TWO-DIMENSIONAL SPECTRAL/STATISTICAL ANALYSIS OF MARINE MAGNETIC DATA: IMPLICATIONS FOR DEPTH-TO-MAGNETIC SOURCE. A Thesis by JEFFREY LEE SCHRULL, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, May 1987. Major Subject: Geophysics. Approved as to style...
Stellar Evolution A Statistical Model
van Dyk, David
Statistical Analysis of Stellar Evolution. David A. van Dyk, Steven DeGennaro, Nathan Stein, William H. ... (Slide headings: A Statistical Model; Statistical Computation; Analysis of the Hyades Cluster.)
Near-Infrared Detection of Flow Injection Analysis by Acoustooptic Tunable Filter-Based
Reid, Scott A.
...pretreatment), and the availability of powerful and effective multivariate statistical methods for data ... wavelength. As a consequence, the multivariate calibration methods cannot be used to analyze data...
Frome, EL
2005-09-20T23:59:59.000Z
Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
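The censored-lognormal maximum likelihood fit described here has a compact form: detected values contribute the log-density of the log-concentration, while non-detects contribute the log-CDF at the detection limit. The following is a minimal sketch, assuming NumPy and SciPy are available; the names are illustrative and this is not the code from the report.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def lognormal_mle_censored(values, detected):
    """Fit a two-parameter lognormal to left-censored data by maximum
    likelihood: detects contribute the log-density, non-detects (reported
    at the detection limit) contribute the log-CDF at that limit."""
    x = np.log(np.asarray(values, dtype=float))
    d = np.asarray(detected, dtype=bool)

    def nll(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)  # keeps sigma positive
        ll = norm.logpdf(x[d], mu, sigma).sum()
        ll += norm.logcdf(x[~d], mu, sigma).sum()
        return -ll

    res = minimize(nll, x0=[x.mean(), 0.0], method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    # arithmetic mean exposure level implied by the lognormal fit
    return mu_hat, sigma_hat, np.exp(mu_hat + sigma_hat ** 2 / 2)
```

Confidence limits on the mean, percentiles, and exceedance fraction would then follow from the fitted parameters, as the report describes.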
Maximum Likelihood Estimation of Mixture Densities for Binned and Truncated Multivariate
Smyth, Padhraic
Maximum Likelihood Estimation of Mixture Densities for Binned and Truncated Multivariate Data ... in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (1988)...
Computing Large Sparse Multivariate Optimization Problems with an Application in Biophysics
Boppana, Rajendra V.
High-performance computational techniques for this final phase of the analysis are well known and are not the focus of this paper. Abstract: We present a novel divide-and-conquer method for parallelizing a large-scale multivariate linear...
Wolfrum, E. J.; Sluiter, A. D.
2009-01-01T23:59:59.000Z
We have studied rapid calibration models to predict the composition of a variety of biomass feedstocks by correlating near-infrared (NIR) spectroscopic data to compositional data produced using traditional wet chemical analysis techniques. The rapid calibration models are developed using multivariate statistical analysis of the spectroscopic and wet chemical data. This work discusses the latest versions of the NIR calibration models for corn stover feedstock and dilute-acid pretreated corn stover. Measures of the calibration precision and uncertainty are presented. No statistically significant differences (p = 0.05) are seen between NIR calibration models built using different mathematical pretreatments. Finally, two common algorithms for building NIR calibration models are compared; no statistically significant differences (p = 0.05) are seen for the major constituents glucan, xylan, and lignin, but the algorithms did produce different predictions for total extractives. A single calibration model combining the corn stover feedstock and dilute-acid pretreated corn stover samples gave less satisfactory predictions than the separate models.
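To make the multivariate calibration idea concrete, the toy example below fits a linear model from synthetic "spectra" to a constituent value and checks prediction error on held-out samples. It deliberately uses ordinary least squares for brevity; real NIR calibrations like those described here use PLS to cope with the strong collinearity between wavelengths. All names and sizes are hypothetical.

```python
import numpy as np

# Hypothetical illustration: predict a constituent (e.g. a glucan
# fraction) from spectra via a linear multivariate calibration.
rng = np.random.default_rng(42)
n_samples, n_wavelengths = 60, 10
true_coefs = rng.normal(size=n_wavelengths)
spectra = rng.normal(size=(n_samples, n_wavelengths))
glucan = spectra @ true_coefs + rng.normal(scale=0.01, size=n_samples)

# Calibrate on the first 40 samples, validate on the remaining 20.
coefs, *_ = np.linalg.lstsq(spectra[:40], glucan[:40], rcond=None)
pred = spectra[40:] @ coefs
rmse = np.sqrt(np.mean((pred - glucan[40:]) ** 2))
```

The held-out RMSE plays the role of the calibration uncertainty measures discussed in the abstract.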
Stellar Evolution A Statistical Model
van Dyk, David
Embedding Computer Models for Stellar Evolution into a Coherent Statistical Analysis. David A. van Dyk. (Slide headings: A Statistical Model; Statistical Computation; Analysis of the Hyades Cluster.)
A statistical analysis of avalanching heat transport in stationary enhanced core confinement regimes
Tokunaga, S.; Jhang, Hogun; Kim, S. S. [WCI Center for Fusion Theory, National Fusion Research Institute, 52, Yeoeun-dong, Yusung-Gu, Daejon (Korea, Republic of); Diamond, P. H. [WCI Center for Fusion Theory, National Fusion Research Institute, 52, Yeoeun-dong, Yusung-Gu, Daejon (Korea, Republic of); Center for Astrophysics and Space Sciences and Department of Physics, University of California San Diego, La Jolla, California 92093-0429 (United States)
2012-09-15T23:59:59.000Z
We present a statistical analysis of heat transport in stationary enhanced confinement regimes obtained from flux-driven gyrofluid simulations. The probability density functions of heat flux in improved confinement regimes, characterized by the Nusselt number, show significant deviation from Gaussian, with a markedly fat tail, implying the existence of heat avalanches. Two types of avalanching transport are found to be relevant to stationary states, depending on the degree of turbulence suppression. In the weakly suppressed regime, heat avalanches occur in the form of quasi-periodic (QP) heat pulses. Collisional relaxation of zonal flow is likely to be the origin of these QP heat pulses. This phenomenon is similar to transient limit cycle oscillations observed prior to edge pedestal formation in recent experiments. On the other hand, a spectral analysis of heat flux in the strongly suppressed regime shows the emergence of a 1/f (f is the frequency) band, suggesting the presence of self-organized criticality (SOC)-like episodic heat avalanches. These episodic 1/f heat avalanches have a long temporal correlation and constitute the dominant transport process in this regime.
Analysis of the effect of climate change on the yield of crops in Turkey using a statistical
Kurnaz, Levent
Analysis of the effect of climate change on the yield of crops in Turkey using a statistical... 1 Boğaziçi University, 34342, Istanbul, Turkey; 2 Institute of Environmental Sciences, Boğaziçi University, 34342, Istanbul, Turkey. *Corresponding author e-mail: hamza.altinsoy@boun.edu.tr. Abstract: In this study...
Saldin, Dilano
Actuarial science is the quantitative analysis of risk, building on mathematics and statistics. Actuaries help individuals, businesses, and society manage risk by evaluating the likelihood of future events and by weighing a client's risk tolerance against risk parameters such as age of the insured, health status, and place...
Statistical Analysis of Multi-Material Components using Dual Energy CT. Christoph Heinzl, Johann ... plastics-metal components. The presented work makes use of dual energy CT data acquisition for artefact ... pipeline based on the dual exposure technique of dual energy CT. After prefiltering and multi...
Varvarigo, Emmanouel "Manos"
Delay Components of Job Processing in a Grid: Statistical Analysis and Modeling
Abstract: The existence of good probabilistic models for the job arrival process and the delay components introduced at the different stages of job processing in a Grid environment is important for the improved...
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan, E-mail: druan@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)]
2014-03-15T23:59:59.000Z
Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such distribution. Geometric analysis was performed to interpret such parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
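The mixture fit described in the Methods reduces to textbook EM for a two-component one-dimensional Gaussian mixture. A generic sketch (not the authors' implementation) looks like this:

```python
import numpy as np

def em_two_gaussians(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture, the kind of fit the
    study applies to dose values inside injury volumes."""
    x = np.asarray(x, dtype=float)
    # crude initialisation from the data spread
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = (w / (sigma * np.sqrt(2 * np.pi))
                * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted updates of weights, means, and spreads
        nk = resp.sum(axis=0)
        w = nk / x.size
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma
```

On dose samples clustered near 70% and 107% of a 50 Gy prescription, the lower recovered mean plays the role of the candidate critical dose.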
Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.
2009-01-01T23:59:59.000Z
Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.
Statistical Analysis of Baseline Load Models for Non-Residential Buildings
Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila
2008-11-10T23:59:59.000Z
Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
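A common baseline of the family studied here averages the load profile over proxy (non-event) days and then applies a multiplicative morning adjustment from the event day's pre-event hours. A minimal sketch, with illustrative names and a simplified adjustment window, not the paper's exact models:

```python
import numpy as np

def adjusted_baseline(history, event_day, morning_hours=(8, 9, 10)):
    """Averaging baseline with a multiplicative morning adjustment.
    `history` is a (days x 24) array of hourly loads from non-event
    days; the adjustment scales the averaged profile by the event
    morning's actual-to-baseline ratio."""
    history = np.asarray(history, dtype=float)
    event_day = np.asarray(event_day, dtype=float)
    base = history.mean(axis=0)                      # hour-by-hour average
    hrs = list(morning_hours)
    scale = event_day[hrs].sum() / base[hrs].sum()   # morning adjustment
    return base * scale
```

Evaluating such a model over many proxy event days, as the paper does, then yields empirical accuracy and bias measures for each model variant.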
Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents
Wheatley, Spencer; Sornette, Didier
2015-01-01T23:59:59.000Z
We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
Multivariate Calibration Models for Sorghum Composition using...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy E. Wolfrum and C. Payne National Renewable Energy Laboratory T. Stefaniak and W. Rooney...
Brown, Emery N.
Coherence analysis characterizes frequency-dependent covariance between signals, and is useful for multivariate oscillatory data often encountered in neuroscience. The global coherence provides a summary of coherent behavior ...
Li, Mohan
2012-10-19T23:59:59.000Z
of decorrelation in the RF data. In this thesis, a 3D elastography algorithm that estimates all the three components of tissue displacement is implemented and tested statistically. In this research, displacement fields of mechanical models are simulated. RF signals...
Causation-Based T2 Decomposition for Multivariate Process Monitoring and Diagnosis
Jin, Jionghua "Judy"
Multivariate SPC using the Hotelling T² statistic is widely adopted for change detection. However, the T² control chart alone is not capable of identifying the root causes of the change. Thus, decomposition of T² ... Keywords: network, causal model, SPC, T² decomposition. Biography: Ms. Li is a research student in the Department...
Context-free Grammars and Multivariate Stable Polynomials over Stirling Permutations
Chen, Bill
Context-free Grammars and Multivariate Stable Polynomials over Stirling Permutations. William Y. ... of some results of Bóna, Brenti, Janson, Kuba, and Panholzer concerning Stirling permutations. Let Bn(x) be the generating polynomials of the descent statistic over Legendre-Stirling permutations, and let Tn(x) = 2n Cn...
adaptable multivariate calibration: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
OPTIMAL SOLUTIONS OF MULTIVARIATE COUPLING (Mathematics Websites). Summary: OPTIMAL SOLUTIONS OF MULTIVARIATE COUPLING...
Dynamic Conditional Correlation - A Simple Class of Multivariate GARCH Models
Engle, Robert F
2000-01-01T23:59:59.000Z
"Multivariate Simultaneous GARCH," Econometric Theory 11; Engle and Joseph Mezrich (1996), "GARCH for Groups," Risk, August.
Weck, Peter J; Brown, Michael R; Wicks, Robert T
2014-01-01T23:59:59.000Z
The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD) and fully-developed turbulent magnetic fluctuations of the solar wind taken from the WIND spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge fluctuations. The CH ...
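The Bandt-Pompe permutation entropy used for the H coordinate of the CH plane is straightforward to compute: slide a window of length d over the series, record the ordinal pattern of each window, and take the normalized Shannon entropy of the pattern frequencies. A minimal sketch follows (the complexity coordinate C, which additionally requires the Jensen-Shannon divergence from the uniform pattern distribution, is omitted):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy of a scalar time
    series: 0 for a fully ordered signal, 1 for maximal pattern
    randomness."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))  # normalize by log(d!)
```

A monotonic series produces a single ordinal pattern and entropy 0, while long white-noise series approach entropy 1, matching the qualitative ordering of the three plasmas discussed above.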
Thioulouse, Jean
Analysis in SAR and Environmental Studies. Kluwer Academic Publishers. http://pbil.univ-lyon1.fr/R/articles
Statistical Analysis of High-Cycle Fatigue Behavior of Friction Stir Welded AA5083-H321
Grujicic, Mica
High-cycle fatigue results for friction stir welded AA5083-H321 (a strain-hardened/stabilized Al-Mg-Mn alloy) are characterized by a relatively large statistical scatter. The FSW process is particularly suited for butt and lap joining of aluminum alloys in industries such as shipbuilding/marine, aerospace, railway, and land transportation. The basic concept behind the FSW process ...
Not Available
1986-10-01T23:59:59.000Z
Statistical analyses were performed on 4 years of fluoride emissions data from a primary aluminum reduction plant. These analyses were used to develop formulae and procedures for use by regulatory agencies in determining alternate sampling frequencies for secondary (roof monitor) emissions testing on a case-by-case basis. Monitoring procedures for ensuring compliance even with a reduced test frequency are also addressed.
ERROR MODELS FOR LIGHT SENSORS BY STATISTICAL ANALYSIS OF RAW SENSOR MEASUREMENTS
Potkonjak, Miodrag
A silicon solar cell converts light impulses directly into electrical charges that can easily be measured. Error models are important in sensor-based systems, including calibration, sensor fusion, and power management. We developed a system of statistical error models from raw sensor measurements; while the standard procedure is to use error models to enable calibration, in a variant of our approach, we use ...
Garcia-Garcia, Adrian Luis, E-mail: agarciag@ipn.mx; Dominguez-Lopez, Ivan, E-mail: idominguezl@ipn.mx; Lopez-Jimenez, Luis, E-mail: llopez1002@ipn.mx; Barceinas-Sanchez, J.D. Oscar, E-mail: obarceinas@ipn.mx
2014-01-15T23:59:59.000Z
Quantification of nanometric precipitates in metallic alloys has traditionally been performed using transmission electron microscopy, which is nominally a low-throughput technique. This work presents a comparative study of the quantification of η′ and η precipitates in aluminum alloy AA7075-T651 using transmission electron microscopy (TEM) and non-contact atomic force microscopy (AFM). AFM quantification was compared with 2-D stereological results reported elsewhere. Also, a method was developed, using specialized software, to characterize nanometric-size precipitates observed in dark-field TEM micrographs. Statistical analysis of the quantification results from both measurement techniques supports the use of AFM for precipitate characterization. Once the precipitate stoichiometry has been determined by appropriate analytical techniques like TEM, as is the case for η′ and η in AA7075-T651, the relative ease with which specimens are prepared for AFM analysis could be advantageous in product and process development, and quality control, where a large number of samples are expected for analysis on a regular basis. - Highlights: • Nanometric MgZn₂ precipitates in AA7075-T651 were characterized using AFM and TEM. • Phase-contrast AFM was used to differentiate the metal matrix from MgZn₂ precipitates. • TEM and AFM micrographs were analyzed using commercially available software. • AFM image analysis and TEM 2-D stereology render statistically equivalent results.
Zeros in linear multivariable control systems
Ewing, Robert Fennell
1974-01-01T23:59:59.000Z
ZEROS IN LINEAR MULTIVARIABLE CONTROL SYSTEMS A Thesis by ROBERT FENNELL EWING Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE August 1974 Major... Control Systems (August 1974) Robert Fennell Ewing, B.S., Texas A&M University Chairman of Advisory Committee: Dr. J. W. Howze This thesis examines the problem of altering the transfer function matrix of a linear, time-invariant, multivariable system...
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D [University of Pittsburgh Medical Center, Pittsburgh, PA (United States)
2014-06-01T23:59:59.000Z
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear methodology, which reflects the characteristics of respiration, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
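Lempel-Ziv complexity, one of the gauges compared in the abstract, counts the distinct phrases needed to parse a symbolized signal: periodic traces parse into few phrases, irregular ones into many. A rough sketch using a simple LZ78-style parse of a binarized signal (a simplification of the LZ76 variant usually used for LZC):

```python
def lz_complexity(s):
    """Number of phrases in a simple LZ78-style parse of a symbol string."""
    phrases, current = set(), ""
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)  # new phrase ends here; start a fresh one
            current = ""
    return len(phrases) + (1 if current else 0)

# A constant signal parses into few, ever-longer phrases ...
regular = lz_complexity("0000000000")    # 4 phrases: 0, 00, 000, 0000
# ... while an irregular one of the same length needs more phrases.
irregular = lz_complexity("0110100110")  # 6 phrases
```

In practice the raw count is normalized by its expected value for a random sequence of the same length before comparing signals.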
Al-Nasir, Abdul Majid Hamza
1968-01-01T23:59:59.000Z
ORDER RELATIONS AND PRIOR DISTRIBUTIONS IN THE ESTIMATION OF MULTIVARIATE NORMAL PARAMETERS WITH PARTIAL DATA A Thesis by ABDUL MAJID HAMZA AL-NASIR Submitted to the Graduate College of Texas A&M University in partial fulfillment of... the requirement for the degree of MASTER OF SCIENCE August 1968 Major Subject: Statistics ORDER RELATIONS AND PRIOR DISTRIBUTIONS IN THE ESTIMATION OF MULTIVARIATE NORMAL PARAMETERS WITH PARTIAL DATA A Thesis by ABDUL MAJID HAMZA AL-NASIR Approved...
Statistical Analysis of Life Data with Masked Cause of Failure
Basu, Sanjib
A detailed Failure Mode Effect Analysis (FMEA) can be carried out in a routine manner. In reliability, such analysis has been pursued under the general heading of Failure Mode Effect Analysis (FMEA) when the exact causes ...
Czarnecki, Krzysztof
2012 Beaver Computing Challenge Results. Overall statistics: Glasses: 1.74/4, Text Machine: 5.54/8, Hierarchical Structure: 2.98/6, Bebrocarina: 2.14/4, Beaver Navigation: 5.06/8, Beaver Pyramid: 3.23/6, Change Direction: 1.45/4, Power Generation: 5.56/8, Rotating Puzzle: 5...
Extracting bb Higgs Decay Signals using Multivariate Techniques
Smith, W Clarke; /George Washington U. /SLAC
2012-08-28T23:59:59.000Z
For low-mass Higgs boson production at ATLAS at √s = 7 TeV, the hard subprocess gg → h⁰ → bb̄ dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in bb̄-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics; that is, the algorithms cannot identify every jet that results from bb̄ Higgs decay. We combine these algorithms using the neural net (NN) and boosted decision tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg → h⁰ → bb̄ are generated using Monte Carlo methods with Pythia, and then the Toolkit for Multivariate Analysis (TMVA) is used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the m_h⁰-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow for more advanced m_h⁰ reconstructions to be created in the future using multivariate methods.
Development of Statistical Energy Analysis Tools for Toyota Motor Engineering & Manufacturing
Chen, J; Collins, Ro.; Gao, G.; Schaffer, D.; Wu, J.
2014-01-01T23:59:59.000Z
level • Body Weld – little statistical difference between plants → Fixed • Plant A best overall • Plant D and Plant E generally worse. ESL-IE-14-05-06, Proceedings of the Thirty-Sixth Industrial Energy Technology Conference, New Orleans, LA, May 20-23, 2014. Shop Efficiency Rankings - Variable Electricity (Paint / Assembly / Body Weld / Plastic / Stamp): 1: Plant A / Plant D / Plant C / Plant C / Plant C; 2: Plant E / Plant E / Plant D / Plant A / Plant E; 3: Plant C / Plant A / Plant F / Plant E / Plant A; 4: Plant D / Plant C / Plant A...
Applications of Universal Source Coding to Statistical Analysis of Time Series
Ryabko, Boris
2008-01-01T23:59:59.000Z
We show how universal codes can be used for solving some of the most important statistical problems for time series. By definition, a universal code (or a universal lossless data compressor) can compress any sequence generated by a stationary and ergodic source asymptotically to the Shannon entropy, which, in turn, is the best achievable ratio for lossless data compressors. We consider finite-alphabet and real-valued time series and the following problems: estimation of the limiting probabilities for finite-alphabet time series and estimation of the density for real-valued time series, the on-line prediction, regression, classification (or problems with side information) for both types of the time series and the following problems of hypothesis testing: goodness-of-fit testing, or identity testing, and testing of serial independence. It is important to note that all problems are considered in the framework of classical mathematical statistics and, on the other hand, everyday methods of data compression (or ar...
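The central claim, that a universal code compresses a stationary ergodic source down to its Shannon entropy, can be illustrated with a general-purpose compressor standing in for a universal code (bz2 here; the choice of compressor is ours, not the paper's):

```python
import bz2
import random

def compressed_bits_per_symbol(symbols):
    """Crude entropy-rate estimate: compressed size in bits per input symbol."""
    data = bytes(symbols)
    return 8 * len(bz2.compress(data)) / len(data)

random.seed(0)
# Uniform i.i.d. bytes: entropy 8 bits/symbol, essentially incompressible.
noisy = compressed_bits_per_symbol([random.randrange(256) for _ in range(100_000)])
# A constant sequence: entropy 0, compresses to almost nothing.
flat = compressed_bits_per_symbol([0] * 100_000)
```

The same ratio, computed under each of several competing hypotheses, is what drives the compression-based hypothesis tests the abstract describes.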
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01T23:59:59.000Z
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. The distribution which provided the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by Pearson VI, Pearson V, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, Pearson VI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
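The distribution-ranking step described above can be sketched with a chi-square goodness-of-fit statistic over equal-probability bins: fit each candidate distribution, compute the statistic, and rank by its value (a pure-Python illustration with two hypothetical candidates, not the study's twenty-five):

```python
import math
import random

def chi_square_gof(data, cdf, bins=10):
    """Chi-square goodness-of-fit statistic using equal-probability bins.
    Each observation is assigned to a bin by its fitted CDF value, so the
    expected count per bin is simply n / bins."""
    n = len(data)
    expected = n / bins
    observed = [0] * bins
    for x in data:
        k = min(int(cdf(x) * bins), bins - 1)
        observed[k] += 1
    return sum((o - expected) ** 2 / expected for o in observed)

def normal_cdf(mu, sigma):
    return lambda x: 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def expon_cdf(rate):
    return lambda x: 1 - math.exp(-rate * max(x, 0.0))

random.seed(1)
sample = [random.gauss(5.0, 2.0) for _ in range(2000)]
mu = sum(sample) / len(sample)
sigma = math.sqrt(sum((x - mu) ** 2 for x in sample) / len(sample))
fits = {
    "normal": chi_square_gof(sample, normal_cdf(mu, sigma)),
    "exponential": chi_square_gof(sample, expon_cdf(1.0 / mu)),
}
best = min(fits, key=fits.get)  # lowest chi-square ranks as best fit
```

With parameters estimated from the data, the reference chi-square distribution loses one degree of freedom per fitted parameter; the ranking itself is unaffected.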
ibr: Iterative bias reduction multivariate smoothing
Hengartner, Nicholas W [Los Alamos National Laboratory]; Cornillon, Pierre-Andre [AGRO-SUP, FRANCE]; Matzner-Lober, Eric [RENNES 2, FRANCE]
2009-01-01T23:59:59.000Z
Regression is a fundamental data analysis tool for relating a univariate response variable Y to a multivariate predictor X ∈ R^d from the observations (X_i, Y_i), i = 1,...,n. Traditional nonparametric regression uses the assumption that the regression function varies smoothly in the independent variable x to locally estimate the conditional expectation m(x) = E[Y|X = x]. The resulting vector of predicted values Ŷ_i at the observed covariates X_i is called a regression smoother, or simply a smoother, because the predicted values Ŷ_i are less variable than the original observations Y_i. Linear smoothers are linear in the response variable Y and are operationally written as m̂ = S_λY, where S_λ is an n x n smoothing matrix. The smoothing matrix S_λ typically depends on a tuning parameter, which we denote by λ, that governs the tradeoff between the smoothness of the estimate and the goodness-of-fit of the smoother to the data by controlling the effective size of the local neighborhood over which the responses are averaged. We parameterize the smoothing matrix such that large values of λ are associated with smoothers that average over larger neighborhoods and produce very smooth curves, while small λ are associated with smoothers that average over smaller neighborhoods to produce a more wiggly curve that wants to interpolate the data. The parameter λ is the bandwidth for a kernel smoother, the span size for a running-mean smoother or bin smoother, and the penalty factor λ for a spline smoother.
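The smoothing-matrix view can be made concrete with a Nadaraya-Watson (Gaussian-kernel) smoother, where row i of S_λ holds the normalized kernel weights used to predict at x_i (a sketch of a plain linear smoother, not the ibr package's iterative bias-reduction estimator):

```python
import math

def gaussian_smoother_matrix(x, lam):
    """Smoothing matrix S_lambda for a Gaussian-kernel smoother on covariates x.
    Each row is normalized to sum to 1, so yhat = S @ y is a local average."""
    S = []
    for xi in x:
        w = [math.exp(-0.5 * ((xi - xj) / lam) ** 2) for xj in x]
        total = sum(w)
        S.append([wj / total for wj in w])
    return S

def smooth(x, y, lam):
    S = gaussian_smoother_matrix(x, lam)
    return [sum(sij * yj for sij, yj in zip(row, y)) for row in S]

x = [i / 10 for i in range(50)]
y = [math.sin(xi) for xi in x]
yhat_wiggly = smooth(x, y, 0.05)  # small lambda: nearly interpolates the data
yhat_smooth = smooth(x, y, 5.0)   # large lambda: averages over almost all points
```

The two fits bracket the bias-variance tradeoff the paragraph describes: shrinking λ chases the data, growing λ flattens the curve toward a global mean.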
Reddy, T. A.; Claridge, D.; Wu, J.
analysis to identify these models. However, such models tend to suffer from physically unreasonable regression coefficients and instability due to the fact that the predictor variables (i.e., climatic parameters, building internal loads, etc.) ...
The Speedup-Test: A Statistical Methodology for Program Speedup Analysis and Computation
Paris-Sud XI, Université de
TOUATI, Julien WORMS, Sébastien BRIAIS. May 2012. Abstract: In the area of high performance computing ... improvement compared to the usual performance analysis method in high performance computing. We explain ...
Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics
Greene, David L [ORNL
2010-01-01T23:59:59.000Z
U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size, and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested, and similar results are obtained.
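The price effects tested above come from regressions of (log) vehicle travel on (log) fuel price, where the slope is the elasticity. The core computation can be sketched as a simple log-log least-squares fit on synthetic data (the -0.1 elasticity below is an assumed illustration, not the paper's estimate):

```python
import math
import random

def ols_slope(xs, ys):
    """Slope and intercept of a simple least-squares line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic 42-year series: log(VMT) = 3.0 - 0.1 * log(price) + noise,
# i.e. a built-in price elasticity of travel of -0.1.
random.seed(42)
log_price = [math.log(1.0 + 0.05 * t) for t in range(42)]
log_vmt = [3.0 - 0.1 * p + random.gauss(0.0, 0.01) for p in log_price]
elasticity, _ = ols_slope(log_price, log_vmt)  # recovers roughly -0.1
```

The actual paper controls for income, vehicle stock, and serial correlation; this sketch shows only the elasticity-as-slope idea.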
Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2012-10-01T23:59:59.000Z
As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in a model calibration effort to reduce the inherent uncertainty of simulation results. This paper focuses on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends of measured data observed from statistical analysis may be caused by either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5 revealed by the capsule metrology measurements in PIE helps justify attributing the reduction in TC readings to mechanisms other than TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, which helps increase confidence and, in other words, reduce model uncertainties in the thermal simulation results of the AGR-1 test.
Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2014-05-01T23:59:59.000Z
As part of the High Temperature Reactors (HTR) R&D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.
Statistical Laboratory & Department of Statistics
Statistical Laboratory & Department of Statistics Annual Report, July 1, 2005 to December 31, 2006. Contents include the Statistical Computing Section, CSSM, and statistical methodology in the nutritional sciences. We were also very pleased to secure a permanent lecturer ...
Statistical static timing analysis considering the impact of power supply noise in VLSI circuits
Kim, Hyun Sung
2009-06-02T23:59:59.000Z
less random than between the gates within a module. 32 REFERENCES [1] Y. M. Jiang and K. T. Cheng, "Analysis of Performance Impact Caused by Power Supply Noise in Deep Submicron Devices," ACM/IEEE Design Automation Conf., New Orleans, LA, June 1999, pp. 760-765. [2] S. Pant, D. Blaauw, V. Zolotov, S. Sundareswaran and R. Panda, "Vectorless Analysis of Supply Noise Induced Delay Variation," IEEE/ACM Int'l Conf. Computer Aided Design, San Jose, CA, Nov. 2003, pp. 184-191. [3...
Statistical analysis of the overnight and daytime return. Fengzhong Wang
Stanley, H. Eugene
are of great importance for economics and econophysics research. A key topic of market studies ... of the financial markets. Practically, this study can help traders to improve trading strategies at the market open. However, there is still a lack of a comprehensive analysis of the overnight and daytime price change for a leading market
A diagnostic procedure for multivariate quality control
Keserla, Adhinarayan A.
1993-01-01T23:59:59.000Z
The proposed diagnostic procedure is triggered by off-target signals from the multivariate control chart. The performance of the procedure is investigated in conjunction with two control charts, the X2 chart and the MC1 chart. Two performance measures...
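Off-target signals in multivariate control are typically flagged by a Hotelling-type T² statistic, which measures distance from the target mean in units of the process covariance. A minimal two-variable sketch (illustrative only; the X2 and MC1 charts studied in the thesis are related but distinct statistics):

```python
def hotelling_t2(x, mean, cov):
    """Hotelling T^2 statistic for a single two-variable observation."""
    a, b = x[0] - mean[0], x[1] - mean[1]
    (s11, s12), (_, s22) = cov
    det = s11 * s22 - s12 * s12
    # Quadratic form (x - mean)^T inv(S) (x - mean) for a 2x2 covariance
    return (s22 * a * a - 2.0 * s12 * a * b + s11 * b * b) / det

mean = [0.0, 0.0]
cov = [[1.0, 0.8], [0.8, 1.0]]  # strongly positively correlated variables
on_target = hotelling_t2([0.5, 0.5], mean, cov)
# This point is modest in each variable but violates the correlation
# structure, so its T^2 is large and would trigger the chart.
off_target = hotelling_t2([2.0, -2.0], mean, cov)
```

The diagnostic step then asks which variable, or which broken correlation, is responsible for the signal.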
Multivariate Optical Computation for Predictive Spectroscopy
Myrick, Michael Lenn
Multivariate Optical Computation for Predictive Spectroscopy. Matthew P. Nelson, Jeffrey F. Aust, ... Research Council of Canada, Ottawa, Ontario, Canada K1A 0R6. A novel optical approach to predicting chemical ... into the structure of a set of paired optical filters. Light passing through the paired filters produces an analog
Inoculating Multivariate Schemes Against Differential Attacks
Adding as few as 10 Plus polynomials to the Perturbed Matsumoto-Imai (PMI) cryptosystem when g = 1 yields the Perturbed Matsumoto-Imai-Plus (PMI+) cryptosystem. Keywords: multivariate, public key. The underlying scheme, called the perturbed Matsumoto-Imai cryptosystem (PMI), is slower as one needs to go through a search
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy
Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.
2013-03-01T23:59:59.000Z
NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.
Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems
Liou, T.S.
1999-12-01T23:59:59.000Z
Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subjected to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces, and hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media that are characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to consider specifically the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures by the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of flow simulation showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. Seepage pattern is dominated by the fraction of asperity contacts, and their shape, size, and spatial correlation.
However, the correlation structure of the permeability field is less important than the spatial correlation of asperity contacts. A faster breakthrough was observed in fractures subjected to higher normal stress, accompanied by a nonlinearly decreasing trend of the effective permeability. Interestingly, seepage dispersion is generally higher in fractures with an intermediate fraction of asperity contacts, and lower for small or large fractions of asperity contacts. However, it may become higher if the ponding becomes significant. Transport simulations indicate that tracers bypass dead-end pores and travel along flow paths that have less flow resistance. Accordingly, tracer breakthrough curves generally show more spreading than breakthrough curves for water. Further analyses suggest that the log-normal time model generally fails to fit the breakthrough curves for water, but it is a good approximation for the breakthrough curves for the tracer.
Fault Detection by Bayesian Networks in Multivariate Processes (Détection de Fautes par Réseaux Bayésiens dans les Procédés Multivariés)
Paris-Sud XI, Université de
between the usual detection methods that are the multivariate control charts (Hotelling's T², MEWMA) and Discriminant Analysis. KEYWORDS: SPC, multivariate, detection, Bayesian network, T², MEWMA, Discriminant Analysis
Design and Testing of a Multivariate Optical Element
Myrick, Michael Lenn
Design and Testing of a Multivariate Optical Element: The First Demonstration of Multivariate Optical Computing for Predictive Spectroscopy. O. Soyemi, D. Eastwood, L. Zhang, ... H Street, Suite 102, Lincoln, Nebraska 68508. A demonstration of multivariate optical computing is presented
Situvis: visualising multivariate context information to evaluate situation specifications
Dobson, Simon
Situvis: visualising multivariate context information to evaluate situation specifications. Adrian K... Context data is highly multivariate and constantly being updated as new readings are recorded. The visualisation of large and complex multivariate data sets, such as those that context-aware system developers
Asymptotic Theory for Multivariate GARCH Processes
Comte, Fabienne
Asymptotic Theory for Multivariate GARCH Processes. F. Comte and O. Lieberman. Revised, July 2001. Abstract: We provide in this paper asymptotic theory for the multivariate GARCH(p, q) process ... strong and ergodic solution to the multivariate GARCH(p, q) process. We prove asymptotic normality of the quasi-
32. Statistics
Masci, Frank
Revised September 2007 by G. Cowan (RHUL). This chapter gives an overview of statistical methods used in High Energy Physics. In statistics, we are interested in using a given sample of data to make inferences about a probabilistic model, e.g., to assess the model's validity or to determine the values of its parameters. There are two main approaches to statistical
Statistical Laboratory & Department of Statistics
Statistical Laboratory & Department of Statistics Annual Report, July 1, 2002 to June 30, 2003, Iowa ... by the American Statistical Association. Dean Isaacson and Mark Kaiser were instrumental in garnering a National ... Chair of the Department of Statistics and Director of the Statistical Laboratory in November, 2002. Dean
N. Panja; A. K. Chattopadhyay
2014-12-05T23:59:59.000Z
We report results of an experimental study, complemented by detailed statistical analysis of the experimental data, on the development of a more effective control method of drug delivery using a pH-sensitive acrylic polymer. New copolymers based on acrylic acid and fatty acid were constructed from dodecyl castor oil, and a tercopolymer based on methyl methacrylate, acrylic acid and acrylamide was prepared using this new approach. Water-swelling characteristics of the fatty acid/acrylic acid copolymer and the tercopolymer in acid and alkali solutions, respectively, have been studied by a step-change method. The antibiotic drug cephalosporin and paracetamol have also been incorporated into the polymer blend through dissolution, with the release of the antibiotic drug being evaluated in bacterial stain media and buffer solution. Our results show that the rate of release of paracetamol is affected by the pH factor and also by the nature of the polymer blend. Our experimental data have later been statistically analyzed to quantify the precise dependence of the polymer decay rates on the pH of the relevant polymer solvents. The time evolution of the polymer decay rates indicates a marked transition from a linear to a strictly non-linear regime depending on whether the chosen sample is a general copolymer (linear) or a tercopolymer (non-linear). Non-linear data extrapolation techniques have been used to make probabilistic predictions about the variation in weight percentages of retained polymers at all future times, thereby quantifying the degree of efficacy of the new method of drug delivery.
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01T23:59:59.000Z
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
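The levelized prices quoted above (e.g., $0.092/kWh for coal at an 80% capacity factor) are present-value ratios: discounted contract payments divided by discounted energy deliveries. A minimal sketch with illustrative numbers (the plant size, payment, and discount rate are our assumptions, not the paper's data):

```python
def levelized_price(annual_payments, discount_rate, annual_kwh):
    """Levelized $/kWh: PV of contract payments over PV of delivered energy."""
    pv_cost = sum(p / (1.0 + discount_rate) ** t
                  for t, p in enumerate(annual_payments, start=1))
    pv_energy = sum(annual_kwh / (1.0 + discount_rate) ** t
                    for t in range(1, len(annual_payments) + 1))
    return pv_cost / pv_energy

# 100 MW plant at an 80% capacity factor with a flat $60M/yr payment
kwh_per_year = 100_000 * 8760 * 0.8  # 100,000 kW * hours/year * capacity factor
price = levelized_price([60_000_000] * 20, 0.08, kwh_per_year)
# With flat payments and flat output the discounting cancels, so the
# result is simply 60e6 / 700.8e6 dollars per kWh.
```

Escalating payments or varying dispatch change the numerator and denominator separately, which is why a consistent levelization method matters when comparing contracts.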
Early Classification of Multivariate Time Series Using a Hybrid HMM/SVM model
Obradovic, Zoran
Early Classification of Multivariate Time Series Using a Hybrid HMM/SVM Model. Mohamed F. Ghalwash. ... The ability to use a shorter time interval for classification is often more favorable than having a slightly more accurate model ... compared with other models that use full time series in both training and testing. Analysis of biomedical data has ...
Hastie, Trevor
References: Becker, R., Chambers, J. & Wilks, A. (1988), The New S Language, Wadsworth. Hastie, T., Tibshirani, R. & Buja, A. (1994), 'Flexible discriminant analysis by optimal scoring', Journal of the American Statistical Association 89, 1255-1270. Hertz, J., Krogh, A. & Palmer, R. (1991), ...
Garcia-Villalba, Manuel; Uhlmann, Markus
2012-01-01T23:59:59.000Z
We have performed a direct numerical simulation of dilute turbulent particulate flow in a vertical plane channel, fully resolving the phase interfaces. The flow conditions are the same as those in the main case of "Uhlmann, M., Phys. Fluids, vol. 20, 2008, 053305", with the exception of the computational domain length which has been doubled in the present study. The statistics of flow and particle motion are not significantly altered by the elongation of the domain. The large-scale columnar-like structures which had previously been identified do persist and they are still only marginally decorrelated in the prolonged domain. Voronoi analysis of the spatial particle distribution shows that the state of the dispersed phase can be characterized as slightly more ordered than random tending towards a homogeneous spatial distribution. It is also found that the p.d.f.'s of Lagrangian particle accelerations for wall-normal and spanwise directions follow a lognormal distribution as observed in previous experiments of ...
A foundation for reliable spatial proteomics data analysis
Gatto, Laurent; Breckels, Lisa M.; Burger, Thomas; Nightingale, Daniel J.H.; Groen, Arnoud J.; Campbell, Callum; Mulvey, Claire M.; Christoforou, Andy; Ferro, Myriam; Lilley, Kathryn S.
2014-05-20T23:59:59.000Z
... to the same degree as using inappropriate training examples. An important factor to consider in one's choice of training examples, i.e. organelle markers, is how well they represent the multivariate data space, i.e. the distribution of proteins over which ... and insightful biological interpretation, and no consistent and robust solutions have been offered to the community so far. Here, we introduce the requirements for rigorous spatial proteomics data analysis as well as the statistical machine learning methodologies ...
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11T23:59:59.000Z
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
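The PCA step described above, reducing mean-centered data to a few interpretable factors, can be sketched with a plain SVD. This is a generic illustration on synthetic data, not the patented GC/MS pipeline; the sample and channel counts are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data: 100 samples in 9 channels generated from 3 latent factors
scores = rng.normal(size=(100, 3))
loadings = rng.normal(size=(3, 9))
X = scores @ loadings + 0.01 * rng.normal(size=(100, 9))

Xc = X - X.mean(axis=0)                    # mean-center before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = s ** 2 / np.sum(s ** 2)    # fraction of variance per component
print(var_explained[:3].sum())             # close to 1: three factors suffice
```

In a real factor-analysis workflow the retained components would then be rotated (and, as in the abstract, refined with MCR-ALS) to improve interpretability.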
Statistical Performance Modeling of SRAMs
Zhao, Chang
2011-02-22T23:59:59.000Z
... due to their characteristically low failure rate, while statistical methods of yield sensitivity analysis are attractive for their high efficiency. This thesis proposes a novel statistical model to conduct yield sensitivity prediction on SRAM cells at the simulation level, which ...
Hitchcock, Adam P.
Students from the Mathematics & Statistics Coop Program have worked in experimental design and data analysis, medical imaging, mathematical finance, and statistical modeling. Example coop work-term duties: performed data mapping and analysis activities; derived ...
Statistical Convergence and Convergence in Statistics
Mark Burgin; Oktay Duman
2006-12-07T23:59:59.000Z
Statistical convergence was introduced in connection with problems of series summation. The main idea of the statistical convergence of a sequence l is that the majority of elements of l converge and we do not care what happens to the other elements. We show (Section 2) that, when mathematically formalized, the concept of statistical convergence is directly connected to the convergence of such statistical characteristics as the mean and standard deviation. At the same time, it is known that sequences that come from real-life sources, such as measurement and computation, do not, in general, allow one to test whether they converge or statistically converge in the strict mathematical sense. To overcome the limitations induced by the vagueness and uncertainty of real-life data, neoclassical analysis has been developed. It extends the scope and results of classical mathematical analysis by applying fuzzy logic to conventional mathematical objects, such as functions, sequences, and series. The goal of this work is the further development of neoclassical analysis. This allows us to reflect and model the vagueness and uncertainty of our knowledge, which results from imprecision of measurement and inaccuracy of computation. In the context of the theory of fuzzy limits, we develop the structure of statistical fuzzy convergence and study its properties.
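The definition above can be made concrete: a sequence statistically converges to L when the natural density of indices falling outside any epsilon-neighborhood of L is zero. A minimal finite-sample check (my illustration, not from the paper) uses the classic example of a sequence that is 1 at perfect squares and 0 elsewhere, which statistically converges to 0 without converging classically.

```python
import math

def outside_density(seq, limit, eps):
    """Finite-sample density of indices n with |x_n - limit| >= eps."""
    bad = sum(1 for x in seq if abs(x - limit) >= eps)
    return bad / len(seq)

N = 100_000
# 1 at perfect squares, 0 elsewhere: no classical limit, statistical limit 0
seq = [1.0 if math.isqrt(n) ** 2 == n else 0.0 for n in range(1, N + 1)]
print(outside_density(seq, 0.0, 0.5))  # about sqrt(N)/N, vanishing as N grows
```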
Independent Statistics & Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Independent Statistics & Analysis
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Office of Environmental Management (EM)
Multivariate Lipschitz optimization: Survey and computational comparison
Hansen, P.; Gourdin, E.; Jaumard, B.
1994-12-31T23:59:59.000Z
Many methods have been proposed to minimize a multivariate Lipschitz function on a box. They pertain to three approaches: (i) reduction to the univariate case by projection (Pijavskii) or by using a space-filling curve (Strongin); (ii) construction and refinement of a single upper bounding function (Pijavskii, Mladineo, Mayne and Polak, Jaumard, Hermann and Ribault, Wood, ...); (iii) branch and bound with local upper bounding functions (Galperin, Pintér, Meewella and Mayne, the present authors). A survey is made, stressing similarities between algorithms, expressed when possible within a unified framework. Moreover, an extensive computational comparison is reported.
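The bounding-function idea in approaches (i) and (ii) can be illustrated in one dimension with the Pijavskii-Shubert saw-tooth scheme: each evaluation f(x_i) contributes the cone f(x_i) - L|x - x_i| to a piecewise-linear lower bound, and the next evaluation is placed where that bound is smallest. This is a minimal sketch of the idea, not any of the surveyed implementations; the test function and iteration count are arbitrary.

```python
def shubert_minimize(f, a, b, L, iters=40):
    """Pijavskii-Shubert minimization of an L-Lipschitz f on [a, b]."""
    xs, fs = [a, b], [f(a), f(b)]
    for _ in range(iters):
        best_x, best_lb = None, float("inf")
        pts = sorted(zip(xs, fs))
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            # intersection of the cones from x1 and x2, and the bound value there
            x = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * L)
            lb = 0.5 * (f1 + f2) - 0.5 * L * (x2 - x1)
            if lb < best_lb:
                best_x, best_lb = x, lb
        xs.append(best_x)           # evaluate f where the lower bound is lowest
        fs.append(f(best_x))
    return min(fs)

print(shubert_minimize(lambda x: abs(x - 0.3), 0.0, 1.0, L=1.0))  # near 0
```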
Essays on Multivariate Modeling in Financial Econometrics
Yoldas, Emre
2008-01-01T23:59:59.000Z
5.1 Testing in Multivariate GARCH Models ... t- and J-Statistics for the GARCH(1,1) Model of NYSE Returns
Multivariable controller increased MTBE complex capacity
Robertson, D.; Peterson, T.J.; O'Connor, D. [DMC Corp., Houston, TX (United States)]; Payne, D.; Adams, V. [Valero Refining Co., Corpus Christi, TX (United States)]
1997-03-01T23:59:59.000Z
Capacity increased by more than 4.6% when one dynamic matrix multivariable controller began operating in Valero Refining Company's MTBE production complex in Corpus Christi, Texas. This was on a plant that was already running well above design capacity due to previously made process changes. A single controller was developed to cover an isobutane dehydrogenation (ID) unit and an MTBE reaction and fractionation plant with the intermediate isobutylene surge drum. The overall benefit is realized by a comprehensive constrained multivariable predictive controller that properly handles all sets of limits experienced by the complex, whether limited by the front-end ID or back-end MTBE units. The controller has 20 manipulated, 6 disturbance and 44 controlled variables, and covers widely varying dynamics with settling times ranging from twenty minutes to six hours. The controller executes each minute with a six hour time horizon. A unique achievement is intelligent surge drum level handling by the controller for higher average daily complex capacity as a whole. The ID unit often operates at simultaneous limits on reactor effluent compressor capacity, cold box temperature and hydrogen/hydrocarbon ratio, and the MTBE unit at impurity in butene column overhead as well as impurity in MTBE product. The paper discusses ether production, isobutane dehydrogenation, maximizing production, controller design, and controller performance.
American Statistical Association National Science Foundation
Bermúdez, José Luis
American Statistical Association, National Science Foundation, U.S. Census Bureau. ASA, American Statistical Association, 732 North Washington Street, Alexandria, VA 22314-1943. ... statistical methodology and computing, information and behavioral science, and geography.
Term statistics Zipf's law text statistics
Lu, Jianguo
Term statistics; Zipf's law. Text statistics lecture, October 20, 2014. Outline: 1. Term statistics. 2. Zipf's law.
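Zipf's law states that the frequency of the r-th most common term is roughly proportional to 1/r^s with s near 1. A quick way to check it on frequency data is a least-squares fit in log-log space; the sketch below recovers the exponent from exactly Zipfian synthetic frequencies (the constants are arbitrary, invented for illustration).

```python
import math

C, s_true, R = 1000.0, 1.0, 100
freqs = [C / r ** s_true for r in range(1, R + 1)]   # f(r) = C / r^s

# fit log f = log C - s * log r by ordinary least squares
xs = [math.log(r) for r in range(1, R + 1)]
ys = [math.log(f) for f in freqs]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
print(-slope)  # estimated exponent s, recovers 1.0 here
```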
Cut Generation for Optimization Problems with Multivariate Risk ...
Simge Küçükyavuz
2014-07-26T23:59:59.000Z
Jul 26, 2014 ... Cut Generation for Optimization Problems with Multivariate Risk Constraints. Simge Küçükyavuz (kucukyavuz.2 ***at*** osu.edu) Nilay Noyan ...
Lamprecht, William Otto
1967-01-01T23:59:59.000Z
... and understanding which made it all worthwhile. TABLE OF CONTENTS: Acknowledgments; List of Tables; List of Figures; Chapter I. Introduction; II. Experimental Procedure; III. Multiple Regression Analysis on Dietary Feed Intake; IV. Discriminant ... Including Periods; 8. Discriminant Analysis - Preliminary Analysis (17 Discriminants); 9. Discriminant Analysis Lambda Values and Analysis of Variance; 10. Discriminant Analysis Lambda Values and Analysis of Variance on Log Transformed Data ...
Paul, Satyakama; Marwala, Tshilidzi
2011-01-01T23:59:59.000Z
South Africa assumes a significant position in the insurance landscape of Africa. The present research, based upon qualitative and quantitative analysis, shows that it exhibits the characteristics of a Complex Adaptive System. In addition, a statistical analysis of risk measures, through value at risk and conditional tail expectation, is carried out to show how an individual insurance company copes under external complexities. The authors believe that an explanation of the coping strategies, and the subsequent managerial implications, would enrich our understanding of complexity in business.
A multivariate quadrature based moment method for supersonic combustion modeling
Raman, Venkat
Pratik Donde. ... of thermochemical variables can be used for accurately computing the combustion source term. The direct quadrature method of moments (DQMOM) is well suited for multivariate problems like combustion. Numerical ...
Multivariable Discrete Time Repetitive Control System Hammoud Saari1
Boyer, Edmond
Hammoud Saari (SETRAM, France; hammoud.saari@yahoo.fr) and Bernard Caron (bernard.caron@univ-savoie.fr). Keywords: Repetitive Control, Multivariable ... Ahn et al. (2007) and Saari et al. (2010). Most of their works were focused on the problem ...
Multivariate wavelet kernel regression method Samir Touzani, Daniel Busbya
Paris-Sud XI, Université de
Samir Touzani, Daniel Busby (IFP Energies nouvelles). A multivariate nonparametric regression method in the framework of wavelet decomposition. We call this method the wavelet kernel ANOVA (WK-ANOVA), which is a wavelet-based reproducing kernel Hilbert space (RKHS) method.
FAULT DETECTION IN A MULTIVARIATE PROCESS WITH A BAYESIAN NETWORK
Boyer, Edmond
... with simulations in order to analyze and compare its performance to other multivariate charts, Hotelling's T² and MEWMA. Keywords: SPC, multivariate, detection, Bayesian network, T², MEWMA. 1. Introduction. Control charts have been widely used in industry. The aim is to monitor the centering and the scattering ...
Robust Controller Design and PID Tuning for Multivariable Processes
Marquez, Horacio J.
Wen Tan. ... approach, which gives a sub-optimal solution. A PID approximation method is then proposed to reduce a high-order controller, so that it serves as a PID tuning method for multivariable processes. Examples show that the method is easy to use.
A Multivariate Probabilistic Method for Comparing Two Clinical Datasets
Hauskrecht, Milos
Yuriy Sverchkov (yus24@pitt...). ... a concise and mathematically grounded description of multivariate differences between a pair of clinical datasets ... intensive care units (ICUs), or within the same ICU during different periods, may show systematically different outcomes ...
Residential solar home resale analysis
Noll, S.A.
1980-01-01T23:59:59.000Z
One of the determinants of the market acceptance of solar technologies in the residential housing sector is the value placed upon the solar property at the time of resale. The resale factor is shown to be an important economic parameter when net benefits of the solar design are considered over a typical ownership cycle rather than the life cycle of the system. Although a study of solar resale in Davis, CA, indicates that those particular homes have been appreciating in value faster than nonsolar market comparables, no study has been made that would confirm this conclusion for markets in other geographical locations with supporting tests of statistical significance. The data to undertake such an analysis are available through numerous local sources; however, case-by-case data collection is prohibitively expensive. A recommended alternative approach is to make use of real estate market data firms, which compile large databases and provide multivariate statistical analysis packages.
acid sequence analysis: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... sequences. ... Radicioni, Daniele. A Method for Sequence Analysis Using Multivariate Analysis (CiteSeer). Summary: We developed a computational sequence analysis ...
Chang, Wen-Kuei; Hong, Tianzhen
2013-01-01T23:59:59.000Z
Occupancy profile is one of the driving factors behind discrepancies between the measured and simulated energy consumption of buildings. The frequencies of occupants leaving their offices and the corresponding durations of absences have significant impact on energy use and the operational controls of buildings. This study used statistical methods to analyze the occupancy status, based on measured lighting-switch data in five-minute intervals, for a total of 200 open-plan (cubicle) offices. Five typical occupancy patterns were identified based on the average daily 24-hour profiles of the presence of occupants in their cubicles. These statistical patterns were represented by a one-square curve, a one-valley curve, a two-valley curve, a variable curve, and a flat curve. The key parameters that define the occupancy model are the average occupancy profile together with probability distributions of absence duration, and the number of times an occupant is absent from the cubicle. The statistical results also reveal that the number of absence occurrences decreases as total daily presence hours decrease, and the duration of absence from the cubicle decreases as the frequency of absence increases. The developed occupancy model captures the stochastic nature of occupants moving in and out of cubicles, and can be used to generate a more realistic occupancy schedule. This is crucial for improving the evaluation of the energy saving potential of occupancy based technologies and controls using building simulations. Finally, to demonstrate the use of the occupancy model, weekday occupant schedules were generated and discussed.
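A stochastic occupancy schedule of the kind described can be generated, in highly simplified form, with a two-state Markov chain over five-minute steps. The transition probabilities below are invented for illustration; they are not the paper's fitted absence-duration distributions or its five identified patterns.

```python
import random

def simulate_presence(p_leave, p_return, steps=288, seed=1):
    """One day of 5-minute presence states: 1 = in cubicle, 0 = absent."""
    rng = random.Random(seed)
    state, trace = 1, []
    for _ in range(steps):
        if state == 1 and rng.random() < p_leave:
            state = 0              # occupant leaves the cubicle
        elif state == 0 and rng.random() < p_return:
            state = 1              # occupant returns
        trace.append(state)
    return trace

trace = simulate_presence(p_leave=0.05, p_return=0.20)
# long-run presence fraction tends toward p_return / (p_leave + p_return) = 0.8
print(sum(trace) / len(trace))
```

Richer models, like the one in the abstract, replace the constant transition probabilities with empirical distributions of absence duration and frequency.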
Mathematics and statistics research department. Progress report, period ending June 30, 1981
Lever, W.E.; Kane, V.E.; Scott, D.S.; Shepherd, D.E.
1981-09-01T23:59:59.000Z
This report is the twenty-fourth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation - Nuclear Division (UCC-ND). Part A records research progress in biometrics research, materials science applications, model evaluation, moving boundary problems, multivariate analysis, numerical linear algebra, risk analysis, and complementary areas. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology and health sciences, chemistry, energy, engineering, environmental sciences, health and safety research, materials sciences, safeguards, surveys, and uranium resource evaluation. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.
The University of Chicago Department of Statistics
Stephens, Matthew
... testing for presence of signal in the brain. It also arises naturally in multivariate analysis. ... :00 PM, 133 Eckhart Hall, 5734 S. University Avenue. ABSTRACT: We consider the well-studied problem of ... Euclidean space or, more generally, an abstract manifold, with or without boundary. This problem arises ...
Application of a single multivariable controller to an FCCU
Cutler, C.R.; Johnston, C.R.; Raven, D.B. (Dynamic Matrix Control Corp., Houston, TX (United States)); Eakens, R.W.; Koepke, J. (Star Enterprise, Port Arthur, TX (United States)); Alrushaid, N. (Saudi ARAMCO, Ras Tanura (Saudi Arabia))
1993-01-01T23:59:59.000Z
A number of significant benefits are realized from the design and operation of a single multivariable controller for a Fluid Catalytic Cracking Unit. A single controller has been built for the Star Enterprise FCCU No. 3 in Port Arthur, Texas. The controller includes the Feed Preheat System, the Reactor, the Regenerator, the Main Fractionator, and the Wet Gas Compression. The controller contains 17 manipulated variables, 41 controlled variables, and 1 disturbance variable. The elapsed time between project initiation and final commissioning was four months, including a one-month delay for unit maintenance. After commissioning, it was determined in the post audit that the simple payout for the project was less than one month. The controller has maintained a high stream factor since its commissioning 8 months ago. The single large-scale controller improves the reliability of the control system, permits the handling of all the interactions between independent variables, removes stability analysis from the controller design, increases the ability of the controller to address the economics of the operation, and increases the maintainability of the system relative to traditional heuristic combinations of PID controllers.
Practical Identifiability of Finite Mixtures of Multivariate Bernoulli Distributions
Carreira-Perpinan, Miguel A; Renals, Steve
The class of finite mixtures of multivariate Bernoulli distributions is known to be nonidentifiable; that is, different values of the mixture parameters can correspond to exactly the same probability distribution. In ...
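Nonidentifiability here means two different parameter sets induce exactly the same probability distribution. A small concrete instance (my construction, not taken from the paper): in two dimensions, a 2-component mixture of independent Bernoullis has five free parameters while the distribution on {0,1}^2 has only three, so distinct mixtures can match every cell probability.

```python
from itertools import product
from math import prod

def mixture_pmf(weights, comps, x):
    """P(X = x) for a finite mixture of independent Bernoulli components."""
    return sum(w * prod(p if xi else 1 - p for xi, p in zip(x, ps))
               for w, ps in zip(weights, comps))

# two different parameterizations with identical pmf on {0,1}^2
mix_a = ([0.5, 0.5], [(0.8, 0.8), (0.2, 0.2)])
mix_b = ([0.5, 0.5], [(0.9, 0.725), (0.1, 0.275)])
for x in product([0, 1], repeat=2):
    print(x, mixture_pmf(*mix_a, x), mixture_pmf(*mix_b, x))
```

Both mixtures have marginals 0.5 and joint P(1,1) = 0.34, so the four cell probabilities agree; only these three moments determine a bivariate binary distribution.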
Multi-variable optimization of pressurized oxy-coal combustion
Zebian, Hussam
2011-01-01T23:59:59.000Z
Simultaneous multi-variable gradient-based optimization with multi-start is performed on a 300 MWe wet-recycling pressurized oxy-coal combustion process with carbon capture and sequestration. The model accounts for realistic ...
1979 DOE statistical symposium
Gardiner, D.A.; Truett T. (comps. and eds.)
1980-09-01T23:59:59.000Z
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.
Callen, Elisabeth F.
2012-12-31T23:59:59.000Z
Mesoscale Convective Systems (MCSs) are the focus of this analysis since it is the convective weather category which is smallest in number but produces the highest amount of precipitation. Being able to forecast these MCSs ...
Arrowood, L.F.; Tonn, B.E.
1992-02-01T23:59:59.000Z
This report presents recommendations relative to the use of expert systems and machine learning techniques by the Bureau of Labor Statistics (BLS) to substantially automate product substitution decisions associated with the Consumer Price Index (CPI). Thirteen commercially available, PC-based expert system shells have received in-depth evaluations. Various machine learning techniques were also reviewed. Two recommendations are given: (1) BLS should use the expert system shell LEVEL5 OBJECT and establish a software development methodology for expert systems; and (2) BLS should undertake a small study to evaluate the potential of machine learning techniques to create and maintain the approximately 350 ELI-specific knowledge bases to be used in CPI product substitution review.
On the Number of Tight Wavelet Frame Generators associated with Multivariate Box Splines
Lai, Ming-Jun
Kronecker product method in [2]. Keywords: Sum of square magnitudes, Multivariate box splines, Tight wavelet
Wavelet analysis of the multivariate fractional Brownian motion
Paris-Sud XI, Université de
Pierre-Olivier Amblard and Sophie Achard, Laboratory Jean Kuntzmann, Grenoble University, France. ... (Achard et al., 2008, 2006). In all these disciplines, many modern sensing approaches allow
A Multivariate Analysis of Freeway Speed and Headway Data
Zou, Yajie
2013-11-11T23:59:59.000Z
key process is the generation of entry vehicle speeds and vehicle arrival times. It is helpful to find desirable mathematical distributions to model individual speed and headway values, because the individual vehicle speed and arrival time...
Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis
Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.
2011-10-17T23:59:59.000Z
Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention on ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined simulation method is meant to augment, and provide new insight into, bulk power transmission network planning, which at present remains mainly confined to maintaining power-system security for single and double component outages under a number of projected future network operating conditions. Although one aim of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyzing extreme events and identifying vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
Four Faculty Positions Applied Statistics & Computational Statistics
Shepp, Larry
Four Faculty Positions Applied Statistics & Computational Statistics The Department of Statistics at the Assistant Professor rank. Two positions are open in the area of Applied Statistics, with a focus on the development of statistical methodology and statistical consulting. The other two positions are open
The University of Chicago Department of Statistics
The University of Chicago Department of Statistics Seminar Series. STEFFEN LAURITZEN, Department of Statistics, University of Oxford. Bayesian Networks for the Analysis of DNA Mixtures. MONDAY, May 21, 2009, at 4
analysis-a potential functional: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... the study) to IR. Functional Data Analysis (FDA) is an extension of traditional data analysis; the difference between multivariate analysis and FDA is that the latter manipulates the functional ...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
analysis structural integrity: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... WHITE NOISE ANALYSIS ... Multivariate Analysis and Geovisualization with an Integrated Geographic Knowledge ...
NSTec Environmental Restoration
2009-04-20T23:59:59.000Z
A statistical analysis and geologic evaluation of recently acquired laboratory-derived physical property data are being performed to better understand and more precisely correlate physical properties with specific geologic parameters associated with non-zeolitized tuffs at the Nevada Test Site. Physical property data include wet and dry bulk density, grain density (i.e., specific gravity), total porosity, and effective porosity. Geologic parameters utilized include degree of welding, lithology, stratigraphy, geographic area, and matrix mineralogy (i.e., vitric versus devitrified). Initial results indicate a very good correlation between physical properties and geologic parameters such as degree of welding, lithology, and matrix mineralogy. However, physical properties appear to be independent of stratigraphy and geographic area, suggesting that the data are transferable with regard to these two geologic parameters. Statistical analyses also indicate that the assumed grain density of 2.65 grams per cubic centimeter used to calculate porosity in some samples is too high. This results in corresponding calculated porosity values approximately 5 percent too high (e.g., 45 percent versus 40 percent), which can be significant in the lower-porosity rocks. Similar analyses and evaluations of the physical property data for zeolitic tuffs and carbonate rocks are ongoing, as are comparisons to geophysical log values.
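The roughly 5-percentage-point effect quoted above follows directly from the porosity formula: total porosity is one minus the ratio of dry bulk density to grain density. The density values below are illustrative, not measured values from the report.

```python
def total_porosity(dry_bulk_density, grain_density):
    """Total porosity in percent from densities in g/cm^3."""
    return 100.0 * (1.0 - dry_bulk_density / grain_density)

bulk = 1.46  # hypothetical dry bulk density, g/cm^3
print(total_porosity(bulk, 2.65))  # assumed grain density: ~44.9 %
print(total_porosity(bulk, 2.45))  # lower measured grain density: ~40.4 %
```

Overstating the grain density inflates the computed porosity, which is the bias the statistical analysis detected.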
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre [Kitware, France; Bennett, Janine Camille
2013-08-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.
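The auto-correlative statistic computed by such an engine is, at its core, the lagged sample correlation of a series with itself. A self-contained sketch of that core quantity (my illustration, not the VTK implementation):

```python
import math

def autocorrelation(xs, lag):
    """Sample autocorrelation at the given lag (biased 1/n normalization)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    cov = sum((xs[t] - mean) * (xs[t + lag] - mean) for t in range(n - lag)) / n
    return cov / var

# a sinusoid is strongly self-correlated at its own period
xs = [math.sin(2 * math.pi * t / 20) for t in range(400)]
print(autocorrelation(xs, 20))  # (n - lag)/n = 0.95 for this exact signal
```

The parallel engines in the report distribute exactly these sums across processes and combine the partial results.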
Mueller, Amy V.
In this paper a second-order method for blind source separation of noisy instantaneous linear mixtures is presented for the case where the signal order k is unknown. Its performance advantages are illustrated by simulations ...
Cook, Di
Effects Research Laboratory, Western Ecology Division, Corvallis, OR 97333 5 OAO, c/o USEPA NHEERL Western Ecology Division, Corvallis, OR 97333 Abstract This paper discusses the extent of virtual reality technology and raises an example of using a highly immersive environment for exploring and mining
2001, Applied Statistics, 50, 143-154. Nonlinear autoregressive time series with multivariate
Stone, J. V.
and Collares Pereira, 1992; Beyer et al., 1995, models for mean instantaneous sky luminance distribution have a value greater than unity when the sky is clear and a value less than unity when the sky is cloudy. Its
Multivariate Statistics of the Jacobian Matrices in Tensor Based Morphometry and Their
Thompson, Paul
to HIV/AIDS Natasha Lepore1 , Caroline A. Brun1 , Ming-Chang Chiang1 , Yi-Yu Chou1 , Rebecca A. Dutton1 , and Paul M. Thompson1 1 Laboratory of Neuro Imaging, Department of Neurology, David Geffen School
Intrinsic alignments of galaxies in the MassiveBlack-II simulation: Analysis of two-point statistics
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Tenneti, Ananth [Carnegie Mellon Univ., Pittsburgh, PA (United States); Khandai, Nishikanta [Brookhaven National Lab. (BNL), Upton, NY (United States); National Inst. of Science Education and Research, Bhubaneswar, Odish (India); Singh, Sukhdeep [Carnegie Mellon Univ., Pittsburgh, PA (United States); Mandelbaum, Rachel [Carnegie Mellon Univ., Pittsburgh, PA (United States); Di Matteo, Tiziana [Carnegie Mellon Univ., Pittsburgh, PA (United States); Feng, Yu [Carnegie Mellon Univ., Pittsburgh, PA (United States)
2015-03-04T23:59:59.000Z
The intrinsic alignment of galaxies with the large-scale density field is an important astrophysical contaminant in upcoming weak lensing surveys. We present detailed measurements of the galaxy intrinsic alignments and associated ellipticity-direction (ED) and projected shape (w_g+) correlation functions for galaxies in the cosmological hydrodynamic MassiveBlack-II (MB-II) simulation. We carefully assess the effects on galaxy shapes, misalignment of the stellar component with the dark matter shape, and two-point statistics of iteratively weighted (by mass and luminosity) definitions of the (reduced and unreduced) inertia tensor. We find that iterative procedures must be adopted for a reliable measurement of the reduced tensor but that luminosity versus mass weighting has only negligible effects. Both ED and w_g+ correlations increase in amplitude with subhalo mass (in the range 10^10 to 6.0 x 10^14 h^-1 M_sun), with a weak redshift dependence (from z = 1 to z = 0.06) at fixed mass. At z ~ 0.3, we predict a w_g+ that is in reasonable agreement with SDSS LRG measurements and that decreases in amplitude by a factor of ~5-18 for galaxies in the LSST survey. We also compared the intrinsic alignment of centrals and satellites, with clear detection of satellite radial alignments within the host halos. Finally, we show that w_g+ (using subhalos as tracers of density) and w_delta+ (using dark matter density) predictions from the simulations agree with those of non-linear alignment (NLA) models at scales where the 2-halo term dominates in the correlations (and we tabulate the associated NLA fitting parameters). The 1-halo term induces a scale-dependent bias at small scales which is not modeled in the NLA model.
Statistics applied to safeguards
Picard, R.R.
1993-05-01T23:59:59.000Z
Statistical methods are central to safeguards work. Measurements forming the basis of much materials accountancy are not perfect - "perfect" in the sense of being error free. Other sessions in this course address the destructive and nondestructive measurement of nuclear material, together with the inherent limitations in those measurements. The bottom line is that measurement errors are a fact of life and, since we can't eliminate them, we have to find a rational way to deal with them - which leads to the world of statistics. Beyond dealing with measurement errors, another area of statistical application involves the sampling of items for verification. Inspectors from the IAEA and domestic regulatory agencies periodically visit operating facilities and make measurements of selected items. By comparing their own measured values to those declared by the facilities, increased confidence is obtained. If verification measurements were not expensive, time consuming, and disruptive to operations, perhaps verification of 100% of the inventories would be desirable. In reality, many constraints lead to inspection of only a portion of those inventories. Drawing inferences about a larger "population" of declared items in a facility based on verification information obtained from a sample of those items is a statistical problem. There are few texts on statistics in safeguards. The lengthy exposition "IAEA Safeguards: Statistical Concepts and Techniques" and the US NRC book edited by Bowen and Bennett are two good sources of general information. In the next section, the subject of measurement quality is addressed. The third section covers the evaluation of MUFs, and discusses the related subjects of error propagation and sequential analysis. The final section covers verification, inspection sample size calculations, and the D statistic. The text is written at an elementary level, with references to the safeguards literature for more detailed treatment.
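The D statistic mentioned in the final section can be sketched as below. The form shown, extrapolating the operator-minus-inspector differences observed on a verification sample of n items to the full population of N declared items, is the standard difference statistic; the item values are invented for illustration.

```python
# Sketch of the D (difference) statistic: scale the total operator-inspector
# difference seen on a sample of n verified items up to all N declared items.
# The numbers below are made-up illustration data, not real inspection values.

def d_statistic(operator_values, inspector_values, population_size):
    n = len(operator_values)
    total_diff = sum(o - i for o, i in zip(operator_values, inspector_values))
    return population_size / n * total_diff

declared = [10.2, 9.8, 10.1, 10.0]   # operator's declared values (sampled items)
measured = [10.0, 9.9, 10.0, 9.9]    # inspector's measurements of the same items
D = d_statistic(declared, measured, population_size=100)  # N = 100 declared items
```

In practice D is compared against its propagated measurement-error variance to decide whether the operator-inspector discrepancy is statistically significant.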
Fitelson, Branden
Some Remarks on the Philosophy of Statistics. Branden Fitelson, Department of Philosophy, San José State University (branden@fitelson.org). Overview of presentation: What are the ends of statistical experiment, analysis ...
J. Mark Heinzle; Claes Uggla
2012-12-21T23:59:59.000Z
In this paper we explore stochastic and statistical properties of so-called recurring spike induced Kasner sequences. Such sequences arise in recurring spike formation, which is needed together with the more familiar BKL scenario to yield a complete description of generic spacelike singularities. In particular we derive a probability distribution for recurring spike induced Kasner sequences, complementing similar available BKL results, which makes comparisons possible. As examples of applications, we derive results for so-called large and small curvature phases and the Hubble-normalized Weyl scalar.
David, Mathieu; Garde, Francois; Boyer, Harry
2014-01-01T23:59:59.000Z
In building studies dealing with energy efficiency and comfort, simulation software needs relevant weather files with optimal time steps. Few tools generate extreme and mean values of simultaneous hourly data, including correlation between the climatic parameters. This paper presents the C++ Runeole software, based on typical weather sequence analysis. It runs an analysis process of a stochastic continuous multivariable phenomenon with frequency properties applied to a climatic database. The database analysis combines basic statistics, PCA (Principal Component Analysis) and automatic classifications. Different ways of applying these methods will be presented. All the results are stored in the Runeole internal database, which allows an easy selection of weather sequences. The extreme sequences are used for system and building sizing, and the mean sequences are used for the determination of the annual cooling loads as proposed by Audrier-Cros (Audrier-Cros, 1984). This weather analysis was tested with the datab...
Multivariate classification of infrared spectra of cell and tissue samples
Haaland, David M. (Albuquerque, NM); Jones, Howland D. T. (Albuquerque, NM); Thomas, Edward V. (Albuquerque, NM)
1997-01-01T23:59:59.000Z
Multivariate classification techniques are applied to spectra from cell and tissue samples irradiated with infrared radiation to determine if the samples are normal or abnormal (cancerous). Mid and near infrared radiation can be used for in vivo and in vitro classifications using at least different wavelengths.
Fast and Flexible Multivariate Time Series Subsequence Search Kanishka Bhaduri
Oza, Nikunj C.
A search algorithm capable of subsequence search on any subset of variables. Moreover, the MTS subsequence approach may include searching on parameters such as speed, descent rate, and vertical flight path. Fast and Flexible Multivariate Time Series Subsequence Search, Kanishka Bhaduri, MCT Inc., NASA ARC.
PENALIZED MULTIVARIATE LOGISTIC REGRESSION WITH A LARGE DATA SET
Liblit, Ben
... -linear model to build a partly flexible model for multivariate Bernoulli data. The joint distribution ... the association between outcome variables. A numerical scheme based on the block one-step SOR ... (Kullback-Leibler) distance ... is used to adaptively select smoothing parameters in each block one-step SOR iteration.
Pareto efficiency for the concave order and multivariate comonotonicity
Paris-Sud XI, Université de
Pareto efficiency for the concave order and multivariate comonotonicity. G. Carlier, R.-A. Dana, A. Galichon. April 7, 2010. Abstract: This paper studies efficient risk-sharing rules for the concave order ... to Landsberger and Meilijson [28], that efficiency is characterized by a comonotonicity condition. The goal ...
MULTIVARIATE SYMMETRY AND ASYMMETRY
Serfling, Robert
MULTIVARIATE SYMMETRY AND ASYMMETRY. Robert J. Serfling, University ... by modern group theory. Here we focus on the notion of symmetry and ... independently distributed as chi-square with m degrees of freedom.
A new sliced inverse regression method for multivariate response regression
... dimension reduction (EDR) space without requiring a prespecified parametric model. The convergence at rate √n of the estimated EDR space is shown. We discuss the choice of the dimension of the EDR space. The numerical ... a way to cluster components of y related to the same EDR space. One can thus apply properly multivariate ...
Generalizations of quantum statistics
O. W. Greenberg
2008-05-02T23:59:59.000Z
We review generalizations of quantum statistics, including parabose, parafermi, and quon statistics, but not including anyon statistics, which is special to two dimensions.
18.441 Statistical Inference, Spring 2002
Hardy, Michael
Reviews probability and introduces statistical inference. Point and interval estimation. The maximum likelihood method. Hypothesis testing. Likelihood-ratio tests and Bayesian methods. Nonparametric methods. Analysis of ...
Essays on microeconomics and statistical decision making
Nieto Barthaburu, Augusto
2006-01-01T23:59:59.000Z
An Introduction to Econometric Theory, Princeton University ... and S. Low (1989): An Econometric Analysis of the Bank ... extensive statistical and econometric literature concerned ...
Demkin, V. P.; Mel'nichuk, S. V. [National Research Tomsk State University, 36, Lenin Ave., 634050 Tomsk (Russian Federation)
2014-09-15T23:59:59.000Z
In the present work, results of investigations into the dynamics of secondary electrons with helium atoms in the presence of the reverse electric field arising in the flare of a high-voltage pulsed beam-type discharge and leading to degradation of the primary electron beam are presented. The electric field in the discharge of this type at moderate pressures can reach several hundred V/cm and leads to considerable changes in the kinetics of secondary electrons created in the process of propagation of the electron beam generated in the accelerating gap with a grid anode. Moving in the accelerating electric field toward the anode, secondary electrons create the so-called compensating current to the anode. The character of electron motion and the compensating current itself are determined by the ratio of the field strength to the concentration of atoms (E/n). The energy and angular spectra of secondary electrons are calculated by the Monte Carlo method for different ratios E/n of the electric field strength to the helium atom concentration. The motion of secondary electrons with threshold energy is studied for inelastic collisions of helium atoms and differential analysis is carried out of the collisional processes causing energy losses of electrons in helium for different E/n values. The mechanism of creation and accumulation of slow electrons as a result of inelastic collisions of secondary electrons with helium atoms and selective population of metastable states of helium atoms is considered. It is demonstrated that in a wide range of E/n values the motion of secondary electrons in the beam-type discharge flare has the character of drift. At E/n values characteristic for the discharge of the given type, the drift velocity of these electrons is calculated and compared with the available experimental data.
Spatial Autocorrelation and Statistical Tests: Some Solutions
Fortin, Marie Josee
Spatial Autocorrelation and Statistical Tests: Some Solutions. Mark R. T. Dale and Marie-Josée Fortin (.fortin@utoronto.ca). © 2009 American Statistical Association and the International Biometric Society ... a problem in analysis, affecting the significance rates of statistical tests, making them too liberal when ...
Multivariate SPC for Total Inertial Tolerancing Maurice Pillet / Boukar Abdelhakim / Eric Pairel /
Paris-Sud XI, Université de
Multivariate SPC for Total Inertial Tolerancing. Maurice Pillet / Boukar Abdelhakim / Eric Pairel ... by minimizing the inertia of the surfaces. The proposed method is based on multivariate SPC. For a given surface ...
Paul T. Baker; Sarah Caudill; Kari A. Hodge; Dipongkar Talukder; Collin Capano; Neil J. Cornish
2014-12-19T23:59:59.000Z
Searches for gravitational waves produced by coalescing black hole binaries with total masses $\gtrsim25\,$M$_\odot$ use matched filtering with templates of short duration. Non-Gaussian noise bursts in gravitational wave detector data can mimic short signals and limit the sensitivity of these searches. Previous searches have relied on empirically designed statistics incorporating signal-to-noise ratio and signal-based vetoes to separate gravitational wave candidates from noise candidates. We report on sensitivity improvements achieved using a multivariate candidate ranking statistic derived from a supervised machine learning algorithm. We apply the random forest of bagged decision trees technique to two separate searches in the high mass $\left( \gtrsim25\,\mathrm{M}_\odot \right)$ parameter space. For a search which is sensitive to gravitational waves from the inspiral, merger, and ringdown (IMR) of binary black holes with total mass between $25\,$M$_\odot$ and $100\,$M$_\odot$, we find sensitive volume improvements as high as $70_{\pm 13}-109_{\pm 11}$\% when compared to the previously used ranking statistic. For a ringdown-only search which is sensitive to gravitational waves from the resultant perturbed intermediate mass black hole with mass roughly between $10\,$M$_\odot$ and $600\,$M$_\odot$, we find sensitive volume improvements as high as $61_{\pm 4}-241_{\pm 12}$\% when compared to the previously used ranking statistic. We also report how sensitivity improvements can differ depending on mass regime, mass ratio, and available data quality information. Finally, we describe the techniques used to tune and train the random forest classifier that can be generalized to its use in other searches for gravitational waves.
Intersplines: A New Approach to Globally Optimal Multivariate Splines Using Interval
Kearfott, R. Baker
... approximators are the multivariate simplex B-splines. Multivariate simplex B-splines consist of Bernstein basis polynomials that are defined on a geometrical structure called a triangulation. ... Secondly, the simplex spline models are parametric models, which allows for efficient approximation ...
Distribution Free Decomposition of Multivariate Data (SPR'98 invited submission)
... neighbor method, kernel estimation [8, 16, 18, 19]. For higher dimensional feature spaces, multivariate ... Distribution Free Decomposition of Multivariate Data, SPR'98 invited submission, Dorin ... Abstract: We present a practical approach to nonparametric ...
15.075 Applied Statistics, Spring 2003
Newton, Elizabeth
This course is an introduction to applied statistics and data analysis. Topics include collecting and exploring data, basic inference, simple and multiple linear regression, analysis of variance, nonparametric methods, and ...
Statistical Network Analysis: Models, Issues,
Fienberg, Stephen E.
David M. Blei, Stephen E. Fienberg, Anna Goldenberg, Eric P. Xing, Alice X. Zheng. Contents, Preface ... University), David M. Blei (Princeton), Stephen E. Fienberg (Carnegie Mellon University), Eric P. Xing ...
Ge, Shuzhi Sam
"Multivariate stochastic approximation using a simultaneous perturbation gradient approximation," IEEE Trans., vol. 19, no. 4, pp. 482-492, 1998. [18] "Adaptive stochastic approximation by the simultaneous ..." and E. K. P. Chong, "A deterministic analysis of stochastic approximation with randomized directions" ...
FISHERY STATISTICS UNITED STATES
FISHERY STATISTICS OF THE UNITED STATES 1972, STATISTICAL DIGEST NO. 66. Prepared by Statistics ... ACKNOWLEDGMENTS: The data in this edition of "Fishery Statistics of the United States" were collected in cooperation with the various States and tabulated by the staff of the Statistics and Market News Division.
Römisch, Werner
Generation of Multivariate Scenario Trees to Model Stochasticity in Power Management. Holger Heitsch, Werner Römisch ... data of EDF Électricité de France. Index Terms -- stochastic programming, power management, scenario ...
Optimization Online - A data-driven, distribution-free, multivariate ...
Pavithra Harsha
2015-06-03T23:59:59.000Z
Jun 3, 2015 ... Furthermore, many other drivers besides price must be included in the demand response model for statistical accuracy, along with conditional ...
State Space Reconstruction for Multivariate Time Series Prediction
I. Vlachos; D. Kugiumtzis
2008-09-12T23:59:59.000Z
In the nonlinear prediction of scalar time series, the common practice is to reconstruct the state space using time-delay embedding and apply a local model on neighborhoods of the reconstructed space. The method of false nearest neighbors is often used to estimate the embedding dimension. For prediction purposes, the optimal embedding dimension can also be estimated by some prediction error minimization criterion. We investigate the proper state space reconstruction for multivariate time series and modify the two abovementioned criteria to search for optimal embedding in the set of the variables and their delays. We pinpoint the problems that can arise in each case and compare the state space reconstructions (suggested by each of the two methods) on the predictive ability of the local model that uses each of them. Results obtained from Monte Carlo simulations on known chaotic maps revealed the non-uniqueness of optimum reconstruction in the multivariate case and showed that prediction criteria perform better when the task is prediction.
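The time-delay embedding step that the abstract generalizes to multivariate series can be sketched for a scalar series as follows (a minimal illustration; the toy series and parameter values are invented):

```python
# Time-delay embedding of a scalar series: each reconstructed state vector
# collects m samples separated by a delay tau, the standard first step before
# fitting a local prediction model on neighborhoods of the reconstructed space.

def delay_embed(series, m, tau):
    """Return the list of m-dimensional delay vectors [x_t, x_{t-tau}, ..., x_{t-(m-1)tau}]."""
    start = (m - 1) * tau
    return [[series[t - k * tau] for k in range(m)]
            for t in range(start, len(series))]

x = [0, 1, 2, 3, 4, 5, 6, 7]
vectors = delay_embed(x, m=3, tau=2)
# first vector is [4, 2, 0]: x_4 together with its samples delayed by 2 and 4
```

The multivariate generalization studied in the paper searches over which variables, and which delays of each, enter these state vectors, guided by a false-nearest-neighbors or prediction-error criterion.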
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)
2013-01-01T23:59:59.000Z
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. This chapter provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)
2012-02-01T23:59:59.000Z
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
ISpace: Interactive Volume Data Classification Techniques Using Independent Component Analysis
Ma, Kwan-Liu
ISpace: Interactive Volume Data Classification Techniques Using Independent Component Analysis ... Keywords: multivariate data analysis, multimodality data, scientific visualization, segmentation, volume rendering ... uses Independent Component Analysis (ICA) and a multidimensional histogram of the volume data.
Multivariable decoupled longitudinal and lateral vehicle control: A model-free design
Multivariable decoupled longitudinal and lateral vehicle control: A model-free design. Lghani ... Model-free control is applied to a multivariable decoupled longitudinal and lateral vehicle control ... and steering angle). It yields driving maneuvers requiring a control coordination of steering angle, braking ...
Flexible Information Visualization of Multivariate Data from Biological Sequence Similarity Searches
Chi, Ed Huai-hsin
Flexible Information Visualization of Multivariate Data from Biological Sequence Similarity Searches ... of other variables. We present an enhanced system for interactive exploration of this multivariate data. We identify a larger set of useful variables in the information space. The new system involves more ...
Wavelet Bi-frames with few Generators from Multivariate Refinable Functions
Ehler, Martin
Wavelet Bi-frames with few Generators from Multivariate Refinable Functions. Martin Ehler, Bin Han ... compactly supported wavelet bi-frames with few generators from almost any pair of compactly supported multivariate refinable functions. In our examples, we focus on wavelet bi-frames whose primal and dual wavelets ...
Vine Copulas as a Way to Describe and Analyze Multi-Variate Dependence in Econometrics
Kreinovich, Vladik
Vine Copulas as a Way to Describe and Analyze Multi-Variate Dependence in Econometrics ... and analyzing multi-variate dependence in econometrics; see, e.g., [1-3, 7, 9-11, 13, 14, 21]. Our experience ... problems of econometrics, there is still a lot of confusion and misunderstanding related to vine copulas.
Bayesian network for the characterization of faults in a multivariate process
Paris-Sud XI, Université de
Bayesian network for the characterization of faults in a multivariate process. Sylvain Verron ... For multivariate processes, we propose an original network structure allowing one to decide whether a fault has appeared in the process. Moreover, this structure permits the identification of the variables that are responsible (root ...
Statistical Language Modelling
Gotoh, Yoshihiko; Renals, Steve
2003-01-01T23:59:59.000Z
the underlying models from large amounts of data. Importantly, such statistical approaches often produce useful results. Statistical approaches seem especially well-suited to spoken language which is often spontaneous or conversational and not readily amenable...
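As a minimal illustration of learning such models from data, the sketch below trains a bigram language model with maximum-likelihood (relative-frequency) estimates; the toy corpus is invented, and real systems add smoothing for unseen word pairs.

```python
# A minimal bigram language model with maximum-likelihood estimates:
# P(w2 | w1) = count(w1, w2) / count(w1), with sentence-boundary markers.
from collections import Counter

def train_bigram(corpus):
    """Return a dict mapping (w1, w2) to the relative-frequency estimate of P(w2 | w1)."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        unigrams.update(tokens[:-1])          # contexts: every token that precedes another
        bigrams.update(zip(tokens, tokens[1:]))
    return {pair: count / unigrams[pair[0]] for pair, count in bigrams.items()}

corpus = ["the cat sat", "the cat ran", "the dog sat"]
p = train_bigram(corpus)
# p[("the", "cat")] is 2/3: "the" is followed by "cat" in 2 of its 3 occurrences
```

Spontaneous speech makes such counts sparse and noisy, which is why the smoothing and backoff techniques alluded to in this literature matter in practice.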
Interpreting Accident Statistics
Ferreira, Joseph Jr.
Accident statistics have often been used to support the argument that an abnormally small proportion of drivers account for a large proportion of the accidents. This paper compares statistics developed from six-year data ...
Statistics 36-756: Advanced Statistics II Syllabus: Fall, 2006
Fienberg, Stephen E.
Statistics 36-756: Advanced Statistics II. Syllabus: Fall, 2006. Instructor: Stephen E. Fienberg ... Journal of the American Statistical Association, Journal of the Royal Statistical Society, Statistical ... To consider major topics from statistical theory and the foundations of inference not covered in Statistics 36 ...
Scalable k-means statistics with Titan.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2009-11-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
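The k-means statistic this engine computes can be illustrated with a minimal serial sketch of Lloyd's algorithm in plain Python (this is not the VTK/Titan C++ API, and the data points are invented):

```python
# Minimal serial k-means (Lloyd's algorithm): alternate assigning points to the
# nearest centre and moving each centre to the mean of its assigned points.

def kmeans(points, centers, iterations=10):
    k = len(centers)
    for _ in range(iterations):
        # assignment step: nearest centre by squared Euclidean distance
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # update step: move each centre to the mean of its cluster
        for j, members in enumerate(clusters):
            if members:
                dim = len(members[0])
                centers[j] = tuple(sum(p[i] for p in members) / len(members)
                                   for i in range(dim))
    return centers

data = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
final = kmeans(data, centers=[(0.0, 0.0), (5.0, 5.0)])
```

The parallel engine described in the report distributes the assignment step across processes and merges the per-process sums in the update step, which is what makes the statistic scalable.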
Statistics Statistique Canada Canada
Sinnamon, Gordon J.
Statistics Canada / Statistique Canada. Human Resources and Skills Development Canada / Ressources humaines et Développement des compétences Canada. Culture, Tourism and the Centre for Education Statistics ... about this product or the wide range of services and data available from Statistics Canada, visit our ...
STATISTICAL COMPUTING AND GRAPHICS
Masci, Frank
STATISTICAL COMPUTING AND GRAPHICS: Data-Based Choice of Histogram Bin Width ... Marron for a helpful comment. © 1997 American Statistical Association ... bin width should be chosen so ... The most important ... of the "optimal" bin width. Default bin widths in most common statistical packages are, at least for large samples ...
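A data-based bin-width choice of the kind this article discusses can be illustrated with the Freedman-Diaconis rule, h = 2·IQR·n^(-1/3); this is shown as a familiar example of a data-driven rule, not necessarily the plug-in rule developed in the article itself.

```python
# Freedman-Diaconis histogram bin width: h = 2 * IQR * n^(-1/3).
# Robust to outliers because it uses the interquartile range rather than
# the sample standard deviation.

def freedman_diaconis_width(data):
    xs = sorted(data)
    n = len(xs)

    def quantile(q):
        # simple linear-interpolation quantile on the sorted sample
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    iqr = quantile(0.75) - quantile(0.25)
    return 2.0 * iqr * n ** (-1.0 / 3.0)

data = [float(i) for i in range(1, 101)]   # 1..100
h = freedman_diaconis_width(data)
```

As the snippet notes, package defaults tuned for small samples tend to oversmooth large ones; data-based rules like this adapt the width to both spread and sample size.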
Kreinovich, Vladik
Fast Algorithms for Computing Statistics Under Interval and Fuzzy Uncertainty ... In engineering applications, we are interested in computing statistics. For example, in environmental analysis ... a reasonable statistic for estimating the mean value of a probability distribution is the population average E ...
Derrien, H.; Harvey, J.A.; Larson, N.M.; Leal, L.C.; Wright, R.Q.
2000-05-01T23:59:59.000Z
The average {sup 235}U neutron total cross sections were obtained in the energy range 2 keV to 330 keV from high-resolution transmission measurements of a 0.033 atom/b sample [1]. The experimental data were corrected for the contribution of isotope impurities and for resonance self-shielding effects in the sample. The results are in very good agreement with the experimental data of Poenitz et al. [4] in the energy range 40 keV to 330 keV and are the only available accurate experimental data in the energy range 2 keV to 40 keV. ENDF/B-VI evaluated data are 1.7% larger. The SAMMY/FITACS code [2] was used for a statistical model analysis of the total cross section, selected fission cross sections and data in the energy range 2 keV to 200 keV. SAMMY/FITACS is an extended version of SAMMY which allows consistent analysis of the experimental data in the resolved and unresolved resonance regions. The Reich-Moore resonance parameters were obtained [3] from SAMMY Bayesian fits of high-resolution experimental neutron transmission and partial cross section data below 2.25 keV, and the corresponding average parameters and covariance data were used in the present work as input for the statistical model analysis of the high-energy range of the experimental data. The result of the analysis shows that the average resonance parameters obtained from the analysis of the unresolved resonance region are consistent with those obtained in the resolved energy region. Another important result is that the ENDF/B-VI capture cross section could be too small by more than 10% in the energy range 10 keV to 200 keV.
BS in STATISTICS: Statistical Science Emphasis (695220) MAP Sheet Department of Statistics
Olsen Jr., Dan R.
... the following: Stat 121 Principles of Statistics; Stat 151 Introduction to Bayesian Statistics; Stat 201 Statistics for Engineers & Scientists; Stat 301 Statistics & Probability for Sec Ed. Note: Students who have ...
Statistics 221 Statistical Computing Methods Instructor: Mark Irwin
Irwin, Mark E.
Statistics 221 - Statistical Computing Methods. Instructor: Mark Irwin. Office: Science Center 235 ... Prerequisites: Linear algebra, Statistics 111, and knowledge of a computer programming language. Statistics 220 ... (1988). Elements of Statistical Computing: Numerical Computation. CRC Press. Splus / R: Venables, W.N. ...
Statistical Computing with R Eric Slud, Math. Dept., UMCP
Maryland at College Park, University of
Statistical Computing with R Eric Slud, Math. Dept., UMCP October 21, 2009 Overview of Course as indicated on the Course Syllabus. These fall roughly into three main headings: (A). R (& SAS) language and implementation of statistical algorithms, primarily in R; and (C). Data analysis and statistical applications
STATISTICS OF PRECIPITATION EXTREMES: QUANTIFYING CONFIDENCE IN TRENDS
Katz, Richard
STATISTICS OF PRECIPITATION EXTREMES: QUANTIFYING CONFIDENCE IN TRENDS. Rick Katz, Institute ... this situation (e.g., the "extRemes" package in the open-source statistical programming language R). Maximum likelihood ... (1) Introduction · Extreme value analysis under stationarity -- statistical theory ... in Causes of Trends
STATISTICS OF EXTREMES IN CLIMATE CHANGE Richard W. Katz
Katz, Richard
STATISTICS OF EXTREMES IN CLIMATE CHANGE. Richard W. Katz, Institute for Study of Society ... analysis available within the open-source statistical programming language R. OUTLINE: Lecture 1 ... the application of the statistical theory of extreme values to climate, in general, and to climate change ...
Statistical Computing with R Eric Slud, Math. Dept., UMCP
Maryland at College Park, University of
Statistical Computing with R Eric Slud, Math. Dept., UMCP August 30, 2009 Overview of Course as indicated on the Course Syllabus. These fall roughly into three main headings: (A). R (& SAS) language and implementation of statistical algorithms, primarily in R; and (C). Data analysis and statistical applications
1 Statistics Statistics plays an important role throughout society, providing
Vertes, Akos
STATISTICS: Statistics plays an important role throughout society, providing data ... They also explore how those skills can be applied to develop new initiatives. ... UNDERGRADUATE Bachelor's program · Bachelor of Science with a major in statistics (http://bulletin.gwu.edu/arts-sciences/statistics)
Practical Statistical Thinking Probability: The Language of Statistics
Probability: The Language of Statistics. Essentials of Statistics and Probability. Dhruv Sharma, May 22, 2007. Department of Statistics, NC State University (dbsharma@ncsu.edu). SAMSI Undergrad Workshop.
Edinburgh Research Explorer Statistical Constraints
Millar, Andrew J.
Edinburgh Research Explorer: Statistical Constraints. Citation for published version: Rossi, R, Prestwich, S & Tarim, SA 2014, 'Statistical Constraints', paper presented at 21st biennial European ... that links statistics and constraint programming. We discuss two novel statistical constraints and some ...
Part I STATISTICAL PHYSICS 1 Statistical Physics
unknown authors
In this first part of the book we shall study aspects of classical statistical physics that every physicist should know, but are not usually treated in elementary thermodynamics courses. Our study will lay the microphysical (particle-scale) foundations for the continuum physics of Parts II—VI. As a central feature of our approach, we shall emphasize the intimate connections between the relativistic formulation of statistical physics and its nonrelativistic limit, and between quantum statistical physics and the classical theory. Throughout, we shall presume that the reader is familiar with elementary thermodynamics, but not with other aspects of statistical physics. In Chap. 2 we will study kinetic theory — the simplest of all formalisms for analyzing systems of huge numbers of particles (e.g., molecules of air, or neutrons diffusing through a nuclear reactor, or photons produced in the big-bang origin of the Universe). In kinetic theory the key concept is the “distribution function ” or “number density of particles in phase space”, N; i.e., the number of particles per unit 3-dimensional volume of ordinary space and per unit 3-dimensional volume of momentum space. Despite first appearances, N turns out to be a geometric, frame-independent entity. This N and the frame-independent laws it
Kulik, Rafal
R for Statistics. Rafal Kulik, Department of Mathematics and Statistics, University of Ottawa. Statistical Society of Ottawa, 23 September 2011.
thesis for the degree of licentiate of engineering Contributions to Statistical
Patriksson, Michael
Thesis for the degree of licentiate of engineering: Contributions to Statistical Analysis of Gene Expression Data. Anders Sjögren. (c) Anders Sjögren, 2005. Licentiate Thesis, ISSN ...
Nemati, Shamim, 1980-
2013-01-01T23:59:59.000Z
Physiological control systems involve multiple interacting variables operating in feedback loops that enhance an organism's ability to self-regulate and respond to internal and external disturbances. The resulting multivariate ...
Ba, Demba Elimane
2011-01-01T23:59:59.000Z
The formulation of multivariate point-process (MPP) models based on the Jacod likelihood does not allow for simultaneous occurrence of events at an arbitrarily small time resolution. In this thesis, we introduce two versatile ...
Math 224 Multivariable Calculus and Analytic Geometry I Winter 2013 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Winter 2013. Instructor: Amites Sarkar. My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu.
Math 224 Multivariable Calculus and Analytic Geometry I Winter 2014 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Winter 2014. Instructor: Amites Sarkar. My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu.
Math 224 Multivariable Calculus and Analytic Geometry I Spring 2014 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Spring 2014. Instructor: Amites Sarkar. My e-mail is amites.sarkar@wwu.edu.
Math 225 Multivariable Calculus and Analytic Geometry II Spring 2012 Instructor Amites Sarkar
Sarkar, Amites
Math 225 Multivariable Calculus and Analytic Geometry II, Spring 2012. Instructor: Amites Sarkar, 216 Bond Hall. My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu.
Parallel contingency statistics with Titan.
Thompson, David C.; Pebay, Philippe Pierre
2009-09-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines achieve. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors.
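The report's snippets target the VTK/Titan C++ engines; as a hedged illustration only (plain Python, not the Titan API), the learn/derive phases of a contingency statistics engine amount to tallying a joint table and deriving marginal and conditional probabilities:

```python
from collections import Counter

def contingency_learn(pairs):
    """Learn phase: tally a contingency table over (x, y) observations.
    Derive phase: turn counts into joint, marginal, and conditional
    probability estimates."""
    table = Counter(pairs)
    n = sum(table.values())
    joint = {xy: c / n for xy, c in table.items()}
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    cond = {(x, y): p / px[x] for (x, y), p in joint.items()}  # P(y | x)
    return joint, dict(px), dict(py), cond

data = [("a", 0), ("a", 0), ("a", 1), ("b", 1)]
joint, px, py, cond = contingency_learn(data)
```

One plausible reading of the scalability caveat is that the tally step distributes easily, but partial tables must be merged across processors before the derive phase, unlike the fixed-size moment updates of the other engines.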
Topology for statistical modeling of petascale data.
Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A&M University, College Station, TX)
2011-07-01T23:59:59.000Z
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
Spin - or, actually: Spin and Quantum Statistics
Juerg Froehlich
2008-02-29T23:59:59.000Z
The history of the discovery of electron spin and the Pauli principle and the mathematics of spin and quantum statistics are reviewed. Pauli's theory of the spinning electron and some of its many applications in mathematics and physics are considered in more detail. The role of the fact that the tree-level gyromagnetic factor of the electron has the value g = 2 in an analysis of stability (and instability) of matter in arbitrary external magnetic fields is highlighted. Radiative corrections and precision measurements of g are reviewed. The general connection between spin and statistics, the CPT theorem and the theory of braid statistics are described.
Weakly sufficient quantum statistics
Katarzyna Lubnauer; Andrzej Łuczak; Hanna Podsędkowska
2009-11-23T23:59:59.000Z
Some aspects of weak sufficiency of quantum statistics are investigated. In particular, we give necessary and sufficient conditions for the existence of a weakly sufficient statistic for a given family of vector states, investigate the problem of its minimality, and find the relation between weak sufficiency and other notions of sufficiency employed so far.
University of Technology, Sydney
... humankind has the ability to easily read and write DNA sequence, a code which describes the very essence of life itself. Global DNA sequencing operations currently generate 30 TB of data every day, and the rate of this data generation more than doubles every year, outpacing Moore's Law. Sequence analysis has entered ...
Statistical laws in linguistics
Altmann, Eduardo G
2015-01-01T23:59:59.000Z
Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws...
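As a minimal sketch (not from the paper) of the kind of fitting under discussion, the Zipf exponent can be estimated by least squares on log-log rank-frequency data; the paper's point is that such point estimates need an accompanying fluctuation model before deviations can falsify the law:

```python
import math
from collections import Counter

def zipf_exponent(tokens):
    """Estimate alpha in f(r) ~ C * r^(-alpha) by ordinary least squares
    on the log-log rank-frequency data (a rough but common fit)."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return -sxy / sxx  # alpha > 0 for Zipf-like data

# Synthetic "text" whose word frequencies fall off exactly as 1/rank.
words = []
for rank, w in enumerate(["the", "of", "and", "to", "in"], start=1):
    words += [w] * (120 // rank)
alpha = zipf_exponent(words)
```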
Osnes, J.D. (RE/SPEC, Inc., Rapid City, SD (United States)); Winberg, A.; Andersson, J.E.; Larsson, N.A. (Sveriges Geologiska AB, Goeteborg (Sweden))
1991-09-27T23:59:59.000Z
Statistical and probabilistic methods for estimating the probability that a fracture is nonconductive (or equivalently, the conductive-fracture frequency) and the distribution of the transmissivities of conductive fractures from transmissivity measurements made in single-hole injection (well) tests were developed. These methods were applied to a database consisting of over 1,000 measurements made in nearly 25 km of borehole at five sites in Sweden. The depths of the measurements ranged from near the surface to over 600 m deep, and packer spacings of 20 and 25 m were used. A probabilistic model that describes the distribution of a series of transmissivity measurements was derived. When the parameters of this model were estimated using maximum likelihood estimators, the resulting estimated distributions generally fit the cumulative histograms of the transmissivity measurements very well. Further, estimates of the mean transmissivity of conductive fractures based on the maximum likelihood estimates of the model's parameters were reasonable, both in magnitude and in trend, with respect to depth. The estimates of the conductive-fracture probability were generally in the range of 0.5 to 5.0 percent, with the higher values at shallow depths and increasingly smaller values as depth increased. An estimation procedure based on the probabilistic model and the maximum likelihood estimators of its parameters was recommended. Some guidelines regarding the design of injection test programs were drawn from the recommended estimation procedure and the parameter estimates based on the Swedish data. 24 refs., 12 figs., 14 tabs.
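A much-simplified sketch of this kind of estimation (illustrative only; the report's actual probabilistic model and likelihood are more elaborate): treat measurements at or below a detection limit as nonconductive intervals and fit a lognormal to the conductive transmissivities by maximum likelihood. The detection limit and data here are hypothetical.

```python
import math

def fit_conductive_model(measurements, detection_limit):
    """Treat values at or below the detection limit as nonconductive;
    fit a lognormal to the remaining (conductive) transmissivities by
    maximum likelihood, i.e., sample mean/variance of the logs."""
    detected = [t for t in measurements if t > detection_limit]
    p_nonconductive = 1 - len(detected) / len(measurements)
    logs = [math.log(t) for t in detected]
    mu = sum(logs) / len(logs)
    sigma2 = sum((x - mu) ** 2 for x in logs) / len(logs)  # MLE divides by n
    return p_nonconductive, mu, math.sqrt(sigma2)

# Hypothetical transmissivities (m^2/s); the 1e-9 entries sit below the limit.
data = [1e-9, 1e-9, 1e-7, 1e-6, 1e-5, 1e-4]
p, mu, sigma = fit_conductive_model(data, detection_limit=1e-8)
```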
Gunst, R. F.
2013-05-01T23:59:59.000Z
Phase 3 of the EPAct/V2/E-89 Program investigated the effects of 27 program fuels and 15 program vehicles on exhaust emissions and fuel economy. All vehicles were tested over the California Unified Driving Cycle (LA-92) at 75 degrees F. The program fuels differed on T50, T90, ethanol, Reid vapor pressure, and aromatics. The vehicles tested were new, low-mileage 2008 model year Tier 2 vehicles. A total of 956 test runs were made. Comprehensive statistical modeling and analyses were conducted on methane, carbon dioxide, carbon monoxide, fuel economy, non-methane hydrocarbons, non-methane organic gases, oxides of nitrogen, particulate matter, and total hydrocarbons. In general, model fits determined that emissions and fuel economy were complicated functions of the five fuel parameters. An extensive evaluation of alternative model fits produced a number of competing model fits. Many of these alternative fits produce similar estimates of mean emissions for the 27 program fuels but should be carefully evaluated for use with emerging fuels with combinations of fuel parameters not included here. The program includes detailed databases on each of the 27 program fuels for each of the 15 vehicles.
Applied Math & Statistics: Statistics Minor Curriculum Chart: 2013-2014
Stuart, Josh
Applied Math & Statistics: Statistics Minor Curriculum Chart, 2013-2014. http ... Courses include PSYC 181, CMPE 108, SOCY 103A, CMPS 142, TIM 230. Information on the Statistics Minor: the statistics minor serves as preparation for a graduate degree in statistics or biostatistics. Course Information: with permission from ...
Statistics: Part 1 1. Why bother with statistics?
Francis, Paul
Statistics: Part 1. 1. Why bother with statistics? Why is statistics so necessary for observational ... But your data just don't seem to back up their claim. Statistics allows you to determine how confidently ... a practical introduction to those bits of statistics most vital to observational astronomy. 2. What ...
SDI: Statistical dynamic interactions
Blann, M.; Mustafa, M.G. (Lawrence Livermore National Lab., CA (USA)); Peilert, G.; Stoecker, H.; Greiner, W. (Frankfurt Univ. (Germany, F.R.). Inst. fuer Theoretische Physik)
1991-04-01T23:59:59.000Z
We focus on the combined statistical and dynamical aspects of heavy ion induced reactions. The overall picture is illustrated by considering the reaction {sup 36}Ar + {sup 238}U at a projectile energy of 35 MeV/nucleon. We illustrate the time dependent bound excitation energy due to the fusion/relaxation dynamics as calculated with the Boltzmann master equation. An estimate of the mass, charge and excitation of an equilibrated nucleus surviving the fast (dynamic) fusion-relaxation process is used as input into an evaporation calculation which includes 20 heavy fragment exit channels. The distribution of excitations between residue and clusters is explicitly calculated, as is the further deexcitation of clusters to bound nuclei. These results are compared with the exclusive cluster multiplicity measurements of Kim et al., and are found to give excellent agreement. We consider also an equilibrated residue system at 25% lower initial excitation, which gives an unsatisfactory exclusive multiplicity distribution. This illustrates that exclusive fragment multiplicity may provide a thermometer for system excitation. This analysis of data involves successive binary decay with no compressional effects nor phase transitions. Several examples of primary versus final (stable) cluster decay probabilities for an A = 100 nucleus at excitations of 100 to 800 MeV are presented. From these results a large change in multifragmentation patterns may be understood as a simple phase space consequence, invoking neither phase transitions, nor equation of state information. These results are used to illustrate physical quantities which are ambiguous to deduce from experimental fragment measurements. 14 refs., 4 figs.
Analysis of Semantic Building Blocks via Grobner Bases
Fernandez, Thomas
Analysis of Semantic Building Blocks via Gröbner Bases. Jerry Swan, Geoffrey K. Neumann. ... for the space of multivariate polynomials in n variables, capable of expressing arbitrary sums and products ... for greatest common divisor from univariate to multivariate polynomials. Since its invention in 1965, it has ...
Nonstationary statistical theory for multipactor
Anza, S.; Vicente, C.; Gil, J. [Aurora Software and Testing S.L., Edificio de Desarrollo Empresarial 9B, Universidad Politecnica de Valencia, Camino de Vera s/n, 46022 Valencia (Spain); Boria, V. E. [Departamento de Comunicaciones-iTEAM, Universidad Politecnica de Valencia, Camino de Vera s/n, 46022 Valencia (Spain); Gimeno, B. [Departamento de Fisica Aplicada y Electromagnetismo-ICMUV, Universitat de Valencia, c/Dr. Moliner, 50, 46100 Valencia (Spain); Raboso, D. [Payloads Systems Division, European Space Agency, 2200-AG Noordwijk (Netherlands)
2010-06-15T23:59:59.000Z
This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.
FISHERY STATISTICS OF THE UNITED STATES, 1951
Farley, Director. Statistical Digest 30: Fishery Statistics of the United States, 1951, by A. W. Anderson. Fishery Statistics of the United States and Alaska are compiled and published annually to make available ...
School of Mathematics and Statistics
Du, Jie
School of Mathematics and Statistics, Faculties of Arts, Economics, Education, Engineering and Science. INTERMEDIATE MATHEMATICS and STATISTICS 2012, THE UNIVERSITY OF SYDNEY. Contents: 1. Introduction ... STAT2911 Probability and Statistical Models (Advanced) ... STAT2912 ...
High Performance Multivariate Visual Data Exploration for Extremely Large Data
Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,
2008-08-22T23:59:59.000Z
One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.
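A toy sketch of the two ingredients named above, in plain Python rather than the production infrastructure: a 2D histogram over an adjacent pair of variables (histogram-based parallel coordinates draw one line per non-empty bin, weighted by count) and a range query standing in for the index/query subsetting. Field names and bin settings are invented for illustration.

```python
def hist2d(xs, ys, bins, lo, hi):
    """Bin two variables jointly; a histogram-based parallel-coordinates
    view renders one line per non-empty (bx, by) bin, weighted by count,
    instead of one polyline per record."""
    width = (hi - lo) / bins
    h = {}
    for x, y in zip(xs, ys):
        bx = min(int((x - lo) / width), bins - 1)
        by = min(int((y - lo) / width), bins - 1)
        h[(bx, by)] = h.get((bx, by), 0) + 1
    return h

def range_query(records, conditions):
    """Stand-in for index/query subsetting: keep records satisfying
    every (field, lo, hi) condition."""
    return [r for r in records
            if all(lo <= r[f] <= hi for f, lo, hi in conditions)]

records = [{"E": 1.0, "px": 0.2}, {"E": 3.0, "px": 0.9}, {"E": 2.5, "px": 0.4}]
hot = range_query(records, [("E", 2.0, 4.0)])
h = hist2d([r["E"] for r in records], [r["px"] for r in records],
           bins=4, lo=0.0, hi=4.0)
```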
Beyth, M.; Broxton, D.; McInteer, C.; Averett, W.R.; Stablein, N.K.
1980-06-01T23:59:59.000Z
Multivariate statistical analysis to support the National Uranium Resource Evaluation and to evaluate strategic and other commercially important mineral resources was carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Montrose quadrangle, Colorado. The analysis suggests that: (1) the southern Colorado Mineral Belt is an area favorable for uranium mineral occurrences; (2) carnotite-type occurrences are likely in the nose of the Gunnison Uplift; (3) uranium mineral occurrences may be present along the western and northern margins of the West Elk crater; (4) a base-metal mineralized area is associated with the Uncompahgre Uplift; and (5) uranium and base metals are associated in some areas, and both are often controlled by faults trending west-northwest and north.
OTHER WORKSHEETS FOR R/S-PLUS: A P.M.E.Altham, Statistical Laboratory, University of Cambridge.
Lee, Stephen
...-revenue data and vehicle safety data). New for 2008: Tompkins rankings of Cambridge colleges from 2000 to 2008. Time series analysis. 10. Survival data analysis. 11. The British monarchy data: a question. 12. ... New for July 2003: Mohammad Raza's multivariate data on 50 famous films. 17. Eight men behaving badly ...
Statistics and samples 1.1 What is statistics?
Irwin, Darren
1. Statistics and samples. 1.1 What is statistics? Biologists study the properties of living things ... to get sampled and who did not. Statistics is a technology that describes and measures aspects of nature from samples. Most importantly, statistics lets us quantify the uncertainty of these measures.
Pearson's Goodness of Fit Statistic as a Score Test Statistic
Smyth, Gordon K.
Pearson's Goodness of Fit Statistic as a Score Test Statistic. Gordon K. Smyth. Abstract: For any generalized linear model, the Pearson goodness of fit statistic is the score test statistic for testing the fitted model against the saturated alternative, and the residual deviance is the corresponding likelihood ratio statistic; the relationship between the two statistics is therefore the relationship between the score test and the likelihood ratio test.
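The abstract's claim can be made concrete in the Poisson case, where both statistics are simple functions of the observed counts and the fitted means: the Pearson statistic is the score (quadratic) form and the residual deviance is the likelihood ratio statistic, so they agree closely when the fit is good. A sketch with made-up counts and fitted means:

```python
import math

def pearson_chi2(y, mu):
    """Pearson goodness-of-fit statistic for Poisson responses:
    sum of (observed - fitted)^2 / fitted, since Var(Y) = mu."""
    return sum((yi - mi) ** 2 / mi for yi, mi in zip(y, mu))

def poisson_deviance(y, mu):
    """Residual deviance (likelihood ratio statistic against the
    saturated model) for Poisson responses."""
    total = 0.0
    for yi, mi in zip(y, mu):
        term = yi * math.log(yi / mi) if yi > 0 else 0.0
        total += term - (yi - mi)
    return 2 * total

y = [12, 9, 11, 8]                 # made-up counts
mu = [10.0, 10.0, 10.0, 10.0]      # fitted means under some hypothetical GLM
x2 = pearson_chi2(y, mu)
dev = poisson_deviance(y, mu)
```

With these values the two statistics differ by well under one percent, which is the score-versus-likelihood-ratio agreement the abstract describes.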
A MULTIVARIATE FIT LUMINOSITY FUNCTION AND WORLD MODEL FOR LONG GAMMA-RAY BURSTS
Shahmoradi, Amir, E-mail: amir@physics.utexas.edu [Institute for Fusion Studies, The University of Texas at Austin, TX 78712 (United States)]
2013-04-01T23:59:59.000Z
It is proposed that the luminosity function, the rest-frame spectral correlations, and distributions of cosmological long-duration (Type-II) gamma-ray bursts (LGRBs) may be very well described as a multivariate log-normal distribution. This result is based on careful selection, analysis, and modeling of LGRBs' temporal and spectral variables in the largest catalog of GRBs available to date: 2130 BATSE GRBs, while taking into account the detection threshold and possible selection effects. Constraints on the joint rest-frame distribution of the isotropic peak luminosity (L_iso), total isotropic emission (E_iso), the time-integrated spectral peak energy (E_p,z), and duration (T_90,z) of LGRBs are derived. The presented analysis provides evidence for a relatively large fraction of LGRBs that have been missed by the BATSE detector, with E_iso extending down to ~10^49 erg and observed spectral peak energies (E_p) as low as ~5 keV. LGRBs with rest-frame duration T_90,z ≲ 1 s or observer-frame duration T_90 ≲ 2 s appear to be rare events (≲ 0.1% chance of occurrence). The model predicts a fairly strong and highly significant correlation (rho = 0.58 ± 0.04) between E_iso and E_p,z of LGRBs. Also predicted are strong correlations of L_iso and E_iso with T_90,z and moderate correlation between L_iso and E_p,z. The strength and significance of the correlations found encourage the search for underlying mechanisms, though undermine their capabilities as probes of dark energy's equation of state at high redshifts. The presented analysis favors, but does not necessitate, a cosmic rate for BATSE LGRBs tracing metallicity evolution consistent with a cutoff Z/Z_Sun ~ 0.2-0.5, assuming no luminosity-redshift evolution.
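A multivariate log-normal model is just a multivariate normal in log space, so sampling from it reduces to a Cholesky factor plus exponentiation. The sketch below uses a bivariate toy version with the abstract's predicted E_iso-E_p,z correlation rho = 0.58; the log-space means and spreads are invented, not the paper's fitted values:

```python
import math, random

def sample_lognormal_pair(n, mu, sigma, rho, seed=0):
    """Bivariate log-normal sampler: correlated Gaussians in log space
    (the 2x2 Cholesky factor is applied inline), then exponentiate."""
    rng = random.Random(seed)
    w = math.sqrt(1 - rho ** 2)
    out = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = mu[0] + sigma[0] * z1
        x2 = mu[1] + sigma[1] * (rho * z1 + w * z2)
        out.append((math.exp(x1), math.exp(x2)))
    return out

def log_corr(pairs):
    """Pearson correlation of the logs; recovers rho up to sampling noise."""
    xs = [math.log(a) for a, _ in pairs]
    ys = [math.log(b) for _, b in pairs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Invented log-space parameters; rho = 0.58 matches the abstract's
# predicted E_iso--E_p,z correlation.
pairs = sample_lognormal_pair(20000, mu=(0.0, 0.0), sigma=(1.0, 0.8), rho=0.58)
r = log_corr(pairs)
```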
anti-parasite cytokine responses: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... and Fasciola hepatica-rat were statistically coanalyzed. 1H NMR spectroscopy and multivariate statistical analysis were used to characterize the urine and plasma...
Statistical Learning Theory of Protein Dynamics
Haas, Kevin
2013-01-01T23:59:59.000Z
An integrated statistical learning and simulation approach to molecular simulation, using statistical learning theory ... molecular simulation and statistical learning theory of ...
UNIQUE NUMBERS: 04610, 04620 STATISTICS AND MODELING
Ghosh, Joydeep
The Statistical Analysis part of the course focuses on determining the existence of relationships between variables and the quantification of such relationships. The main tools we will use determine the existence of relationships between variables in data and quantify the strength of those relationships.
Al-Nasir, Abdul Majid Hamza
1968-01-01T23:59:59.000Z
ORDER RELATIONS AND PRIOR DISTRIBUTIONS IN THE ESTIMATION OF MULTIVARIATE NORMAL PARAMETERS WITH PARTIAL DATA. A Thesis by ABDUL MAJID HAMZA AL-NASIR. Submitted to the Graduate College of Texas A&M University in partial fulfillment of ... as to style and content by: Chairman of Committee; Head of Department. August 1968. ABSTRACT: Order Relations and Prior Distributions in the Estimation of Multivariate Normal Parameters with Partial Data. (August 1968) Abdul Majid Hamza Al-Nasir, B...
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30T23:59:59.000Z
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
STATISTICS ESSENTIAL SKILL OUTCOME STATEMENTS
Gering, Jon C.
SB #3205. STATISTICS ESSENTIAL SKILL OUTCOME STATEMENTS (Newly-Revised Version). A liberally educated person is capable of being both a producer and a consumer of statistical information, with some basic ... statistical literacy. The American Statistical Association endorsed the Guidelines for Assessment ...
General Indicators: Performance Statistics
Webb, Peter
[Facilities dashboard table: preventive maintenance (Fire/Life/Safety) completion, utility outages, and trend/initial/target figures; budget data FYTD through March 2012; initial participation reading June 2012.]
General Indicators: Performance Statistics
Webb, Peter
[Facilities dashboard table: preventive maintenance (Fire/Life/Safety) completion, utility outages, and trend/initial/target figures; budget data FYTD through May 2012; initial participation reading August 2012.]
General Indicators: Performance Statistics
Webb, Peter
[Facilities dashboard table: preventive maintenance (Fire/Life/Safety) completion, utility outages, and month-over-month changes; budget data FYTD through April 2012; next reading June 2012.]
BS in STATISTICS: Statistical Science Emphasis (695220) MAP Sheet Department of Statistics
Seamons, Kent E.
Electives from the statistics list include: C S 142 Introduction to Computer Programming; Math 334 Ordinary Differential Equations; Academic Internship: Statistics; Stat 497R Introduction to Statistical Research; Stat 535 Applied Linear ... Students are strongly recommended to choose electives to prepare for the BYU BS/MS statistics integrated program.
Spin, Statistics, and Reflections, II. Lorentz Invariance
Bernd Kuckert; Reinhard Lorenzen
2005-12-21T23:59:59.000Z
The analysis of the relation between modular P_1CT-symmetry, a consequence of the Unruh effect, and Pauli's spin-statistics relation is continued. The result in the predecessor to this article is extended to the Lorentz symmetric situation. A model G_L of the universal covering SL(2,C) of the restricted Lorentz group L_+^↑ is modelled as a reflection group at the classical level. Based on this picture, a representation of G_L is constructed from pairs of modular P_1CT-conjugations, and this representation can easily be verified to satisfy the spin-statistics relation.
Bastin, Georges
Nonlinear Predictive Control of Cement Mills. Lalo Magni, Georges Bastin, and Vincent Wertz. Abstract: A new multivariable controller for cement milling circuits is presented, which is based on a nonlinear model ... a change of hardness of the raw material. Index Terms: cement industry, multivariable control systems.
Franklin, W. Randolph
5D-ODETLAP: A NOVEL FIVE-DIMENSIONAL COMPRESSION METHOD ON TIME-VARYING MULTIVARIABLE GEOSPATIAL DATA. A five-dimensional (5D) geospatial dataset consists of several multivariable 4D datasets, which are sequences of time ... a compression technique for 5D geospatial data as a whole, instead of applying a 3D compression method on each 3D slice.
Vol. 3, No. 2, December 1995. A Multivariate Index Methodology for Landfill Site ...
Nebraska-Lincoln, University of
... development of a cost-effective, non-intrusive screening methodology to determine site characteristics; (2) develop a multivariate index methodology to combine multiple sources of non-intrusive data to arrive at ... of non-intrusive information to create a multivariate index. This multivariate index can be utilized ...
Human-Automation Collaboration in Complex Multivariate Resource Allocation Decision Support Systems
Cummings, Mary "Missy"
Human-Automation Collaboration in Complex Multivariate Resource Allocation Decision Support Systems. M.L. Cummings, Sylvain Bruni, Humans and Automation Laboratory, 77 Massachusetts Ave ... Under uncertainty, typical of supervisory control environments, it is critical that some balance of human-automation ...
Stoffelen, Ad
forecast errors of the European Centre for Medium-Range Weather Forecasts (ECMWF) model. Tropical mass-wind ... Impact Assessment of Simulated Doppler Wind Lidars with a Multivariate Variational Assimilation. De Bilt, Netherlands. CHRISTOPHE ACCADIA AND PETER SCHLÜSSEL, European Organisation
A Multivariate Randomization Test of Association Applied to Cognitive Test Results
A Multivariate Randomization Test of Association Applied to Cognitive Test Results Albert Ahumada and Bettina Beard NASA Ames Research Center Abstract Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem
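The randomization-test idea summarized above can be illustrated with a generic two-sample permutation test (a hypothetical sketch, not the authors' code; their multivariate association statistic is replaced here by a simple difference of means):

```python
import random

def permutation_test(a, b, n_perm=2000, seed=0):
    """Two-sample randomization test: p-value for the observed absolute
    difference of means under random relabelling of the pooled data."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    # Add-one correction keeps the p-value strictly positive.
    return (count + 1) / (n_perm + 1)

# Well-separated samples yield a small p-value; identical samples do not.
p_separated = permutation_test([5.1, 5.3, 4.9, 5.2], [7.0, 7.2, 6.8, 7.1])
p_identical = permutation_test([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

The distribution-free character the abstract mentions comes from conditioning on the pooled data: under the null hypothesis, every relabelling of the observations is equally likely.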
Detecting Climate Change in Multivariate Time Series Data by Novel Clustering and Cluster Tracing. RWTH Aachen University, Germany. {kremer, guennemann, seidl}@cs.rwth-aachen.de. Abstract: Climate change can ... series, and trace the clusters over time. A climate pattern is categorized as a changing pattern
Fully Simplified Multivariate Normal Updates in Non-Conjugate Variational Message Passing
Wand, Matt
Fully Simplified Multivariate Normal Updates in Non-Conjugate Variational Message Passing. BY M. ... updates in non-conjugate variational message passing approximate inference schemes are obtained ... factors in variational message passing approximate inference schemes. Dubbed non-conjugate variational
Supersonic combustion studies using a multivariate quadrature based method for combustion modeling
Raman, Venkat
Supersonic combustion studies using a multivariate quadrature based method for combustion modeling function (PDF) of thermochemical variables can be used for accurately computing the combustion source term of predictive models for supersonic combustion is a critical step in design and development of scramjet engines
Dobigeon, Nicolas
truncated on a simplex. Nicolas Dobigeon and Jean-Yves Tourneret. E-mail: dobigeon@umich.edu. TECHNICAL REPORT. Simplex: $\mathcal{S} = \left\{\boldsymbol{\alpha} : \alpha_r \geq 0,\ r = 1,\ldots,R-1,\ \sum_{r=1}^{R-1} \alpha_r \leq 1\right\}$ (1). Let $\mathcal{N}_{\mathcal{S}}(A, B)$ denote the truncated multivariate normal distribution defined on the simplex $\mathcal{S}$ with mean vector $A$ and covariance matrix $B$.
SAMPLING FROM A MULTIVARIATE GAUSSIAN DISTRIBUTION TRUNCATED ON A SIMPLEX: A REVIEW
Dobigeon, Nicolas
SAMPLING FROM A MULTIVARIATE GAUSSIAN DISTRIBUTION TRUNCATED ON A SIMPLEX: A REVIEW. Yoann Altmann ... to the standard simplex. First, a classical Gibbs sampler is presented. Then, two Hamiltonian Monte Carlo methods ... focus on the special case where the parameters to be sampled belong to a standard simplex
Efficient sampling according to a multivariate Gaussian distribution truncated on a simplex
Dobigeon, Nicolas
1 Efficient sampling according to a multivariate Gaussian distribution truncated on a simplex. I. PROBLEM STATEMENT. Let $\mathcal{S}$ denote the following simplex defined on $\mathbb{R}^{R-1}$: $\mathcal{S} = \left\{\boldsymbol{\alpha} : \alpha_r \geq 0,\ r = 1,\ldots,R-1,\ \sum_{r=1}^{R-1} \alpha_r \leq 1\right\}$ ... defined on the simplex $\mathcal{S}$ with mean vector $A$ and covariance matrix $B$. The probability density function (pdf
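The target distribution described in these reports, a multivariate Gaussian restricted to the simplex, can be illustrated with a naive rejection sampler (a sketch only, assuming independent components, i.e. a diagonal covariance; the reports themselves develop the Gibbs and Hamiltonian Monte Carlo schemes precisely because rejection sampling becomes impractical in higher dimensions):

```python
import random

def sample_gaussian_on_simplex(mean, std, n_samples, seed=0):
    """Naive rejection sampler for an independent (diagonal-covariance)
    Gaussian restricted to the simplex {x_r >= 0, sum_r x_r <= 1}.
    Illustrates the target distribution only; the cited reports use
    far more efficient Gibbs and Hamiltonian Monte Carlo samplers."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n_samples:
        x = [rng.gauss(m, s) for m, s in zip(mean, std)]
        # Keep the draw only if it lands inside the simplex.
        if all(xi >= 0.0 for xi in x) and sum(x) <= 1.0:
            samples.append(x)
    return samples

draws = sample_gaussian_on_simplex([0.3, 0.3], [0.2, 0.2], 500)
```

With the mean well inside the simplex the acceptance rate is tolerable; as the dimension grows or the mean approaches the boundary, almost every draw is rejected, which motivates the MCMC constructions above.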
Scott, Clayton
Nonparametric Assessment of Contamination in Multivariate Data Using Generalized Quantile Sets the measurements in these datasets by the extent to which they represent `nominal' versus `contaminated' instances principles, in the context of contaminated data, is new, and estimation of the key underlying quantities
Al-Duwaish, Hussain N.
Effective and successful control of real life processes in the industry is a meticulous and non-trivial task. ... @kfupm.edu.sa. ABSTRACT: Of the many model structures that can represent a nonlinear process effectively, the Hammerstein ... problem of modelling a nonlinear multivariable steam generating plant using the methods of system
Shneiderman, Ben
on health care. The U.S. Department of Health and Human Services keeps track of a variety of health care ... that enables users to visualize health care data in multivariate space as well as geospatially. It is designed ... a comprehensible and powerful interface for policy makers to visualize health care quality, public health
A MULTIVARIATE SKEW-GARCH MODEL. Giovanni De Luca, Marc G. Genton and Nicola Loperfido
Genton, Marc G.
A MULTIVARIATE SKEW-GARCH MODEL. Giovanni De Luca, Marc G. Genton and Nicola Loperfido. ABSTRACT: ... conditionally on the sign of the one-day lagged US return is skew-normal. The resulting model is denoted Skew-GARCH ... and the Exponential GARCH model (Nelson, 1991), allowing for the inclusion of the asymmetric effect of volatility
Forecasting the Hourly Ontario Energy Price by Multivariate Adaptive Regression Splines
Cañizares, Claudio A.
1 Forecasting the Hourly Ontario Energy Price by Multivariate Adaptive Regression Splines H. In this paper, the MARS technique is applied to forecast the hourly Ontario energy price (HOEP). The MARS models values of the latest pre- dispatch price and demand information, made available by the Ontario
Paris-Sud XI, Université de
1 Optimization of an artificial neural network dedicated to the multivariate forecasting of daily ... Ajaccio, France. Abstract: This paper presents an application of Artificial Neural Networks (ANNs), a popular artificial intelligence technique in the forecasting
Fernandez, Thomas
Comparative application of artificial neural networks and genetic algorithms for multivariate time of artificial neural networks and genetic algorithms in terms of forecasting and understanding of algal blooms-a, Microcystis, short-term prediction, artificial neural network model, genetic algorithm model, rule sets
Cost-efficiency in multivariate Lévy models. Ludger Rüschendorf*, Viktor Wolf*
Rüschendorf, Ludger
Cost-efficiency in multivariate Lévy models. Ludger Rüschendorf*, Viktor Wolf*. November 5, 2014. Abstract: In this paper we determine lowest cost strategies for given payoff distributions, called cost-efficient ... on the pricing of efficient versions of univariate payoffs. We state various relevant existence and uniqueness
and Conditions Agreement. References: [1] J. C. Bezdek, Pattern Recognition with Fuzzy Objective Function ... Unsupervised Fuzzy Clustering of Multi-variate Image Data. Klaus Baggesen Hilger and Allan Aasbjerg Nielsen, IMM, Department of Mathematical Modelling, Technical University of Denmark, 2800 Lyngby, Denmark
Blei, Ron
, and Fabiana Cardetti Mathematics Education REU Project University of Connecticut Summer 2012 BACKGROUND and gains. Proceedings of the Thirty First Annual Meeting of the North American Chapter of the International supplementary application-based tutorials in the multivariable calculus course. International Journal
Multivariate Non-stationary Stochastic Streamflow Models for Two Urban Watersheds. M. Ng, R. Vogel
Vogel, Richard M.
as changes in the water use cycle. Urbanization leads to construction of water distribution systems and storm ... 1 Multivariate Non-stationary Stochastic Streamflow Models for Two Urban Watersheds. M. Ng, R. Vogel. Urbanization and changes in climate and water use are examples of such influences. The evolution
Kimball, Sarah
making, as well as for taking prompt and effective actions to avoid or reduce the effects of droughts ... to characterize agricultural and hydrological droughts (Hayes et al. 2011). The standardization concept ... A Nonparametric Multivariate Multi-Index Drought Monitoring Framework. ZENGCHAO HAO AND AMIR
Ilchmann, Achim; Pahl, M. : Adaptive Multivariable pH Regulation of a Biogas Tower Reactor
Knobloch, Jürgen
Ilchmann, Achim; Pahl, M.: Adaptive Multivariable pH Regulation of a Biogas Tower Reactor. Zuerst ... for a biogas tower reactor. The reactor is a new type for anaerobic treatment of waste water. It has been developed ... There are numerous applications of control theory results to single-input single-output pH control of stirred tank reactors: see
A SERVICE FRAMEWORK FOR LEARNING, QUERYING, AND MONITORING MULTIVARIATE TIME SERIES
Lin, Jessica
Drive MSN 4A5, Fairfax, Virginia 22030-4422, United States. cngan@gmu.edu, brodsky@gmu.edu, jessica ... monitor daily stock prices, weekly interest rates, and monthly price indices to analyze different states ... death rates in a population region. To support such decision-making and determination over multivariate
Frederi G. Viens Professor of Statistics and Mathematics
Viens, Frederi G.
history on page 13) 1992-1996 National Defense Science and Engineering Graduate Fellow, U.C. Irvine 1996 U calculus, Elementary Probability, Elementary Statistics. Undergraduate upper division: Discrete mathematics, Linear Algebra, Intermediate probability and statistics, Real Analysis, Actuarial Models (life
EPI BIO 560 STATISTICAL CONSULTING 0.5 credit
Contractor, Anis
that are reproducible. Lecture topics include sample size and power calculation, data collection, data presentation, selecting appropriate statistical methods, time and project management, report writing, and reproducible representation. · Perform sample size and power analysis by hand and with a statistical software package
Noisy Independent Factor Analysis Model for Density Estimation and Classification
Amato, U.
2009-06-09T23:59:59.000Z
We consider the problem of multivariate density estimation when the unknown density is assumed to follow a particular form of dimensionality reduction, a noisy independent factor analysis (IFA) model. In this model the ...
accident location analysis: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
At the end, some safety-related recommendations are listed. Balda, F 2002-01-01 54 Multivariate data analysis based on two location vectors and two scatter matrices...
Parameterization and Statistical Analysis of Hurricane Waves
Mclaughlin, Patrick William
2014-05-03T23:59:59.000Z
Figure captions (excerpt): WRF station location and map of the surrounding area for Gulfport, MS (O.C. stations shown in red, Bay stations shown in green; the Bay of St. Louis lies on the western side of the map, Biloxi Bay to the east). Figure 4.2: Time series for Bay station #25 (Bay of St. Louis); the maximum wave event (vertical red line) occurs 1 hour after the maximum surge event (vertical blue line).
ARTICLE IN PRESS Computational Statistics & Data Analysis ( )
Li, Jia
growth of textual information available in the public World Wide Web, corporate intranets, news wires ... Two-way Poisson mixture models for simultaneous document classification and word clustering. Jia Li, Hongyuan ... to simultaneous document classification and word clustering is developed using a two-way mixture model of Poisson
Statistical Analysis of Molecular Signal Recording
Glaser, Joshua I.
A molecular device that records time-varying signals would enable new approaches in neuroscience. We have recently proposed such a device, termed a “molecular ticker tape”, in which an engineered DNA polymerase (DNAP) ...
A statistical analysis of tire tread wear
Sperberg, Ronald Leigh
1965-01-01T23:59:59.000Z
positions. There was obvious significant interaction here: as the tires changed from front to rear wheel positions, the average wear increased. This test was designed to show the similarity between the different variables. In this test, the first inspection period was different from any of the other variables. Periods 8, XI and XV had similar effects on the wear patterns as did periods 25, 2, and R1. The greatest similarity ... the 4th, 24th, 8th, 11th, 18th, 14th, 15th ...
STATISTICAL ANALYSIS OF A DETERMINISTIC STOCHASTIC ORBIT
Kaufman, Allan N.
2013-01-01T23:59:59.000Z
A Deterministic Stochastic Orbit.* Allan N. Kaufman, Henry D. ...
Statistical Energy Analysis 2 Long. 3 Long.
Berlin,Technische Universität
Slide topics: Reciprocity Considerations; Multi-modal Systems; Multi-modal Coupling; Coupling Loss Factor; Variance. Built-up structures (heat pump); structure-borne waves; liquid; single-degree-of-freedom system (SDOF) with mass M, resistance R, stiffness K = 1/C, displacement x. © Prof. B.A.T. Petersson. SEA: forced vibration of the M-R-K system under force F.
Statistical Problems in DNA Microarray Data Analysis
Wang, Nancy Naichao
2009-01-01T23:59:59.000Z
editors, Bioinformatics and Computational Biology Solutions
Independent Statistics & Analysis Drilling Productivity Report
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Homotopy in statistical physics
Ralph Kenna
2006-04-12T23:59:59.000Z
In condensed matter physics and related areas, topological defects play important roles in phase transitions and critical phenomena. Homotopy theory facilitates the classification of such topological defects. After a pedagogic introduction to the mathematical methods involved in topology and homotopy theory, the role of the latter in a number of mainly low-dimensional statistical-mechanical systems is outlined. Some recent activities in this area are reviewed and some possible future directions are discussed.
Hierarchical Linear Discriminant Analysis for Beamforming
Park, Haesun
model of h-LDA by relating it to the two-way multivariate analysis of variance (MANOVA), which fits well ... dimension reduction, hierarchical linear discriminant analysis (h-LDA), to a well-known spatial localization ... 1 Hierarchical Linear Discriminant Analysis for Beamforming. Jaegul Choo, Barry L. Drake
Bayesian Statistics Stochastic Simulation -Gibbs sampling
Wright, Francis
Bayesian Statistics: Stochastic Simulation - Gibbs Sampling. Bayesian Statistics: an Introduction, Dr Pettit. Topics: What is Bayesian statistics? Bayes' theorem; the likelihood principle; mixtures of conjugate priors.
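The Gibbs sampling topic listed in this entry can be illustrated with the textbook example of a standard bivariate normal with correlation rho, whose full conditionals are univariate normals (a generic sketch, not taken from the course notes):

```python
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho,
    alternating draws from the exact full conditionals
    x | y ~ N(rho*y, 1 - rho^2) and y | x ~ N(rho*x, 1 - rho^2)."""
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x = y = 0.0
    xs, ys = [], []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x given the current y
        y = rng.gauss(rho * x, sd)   # draw y given the new x
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal(0.8)
```

Each sweep samples one coordinate from its exact conditional given the other; the chain's stationary distribution is the joint bivariate normal, so long-run sample moments approach the target's.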
C. Shane Reese Department of Statistics
Reese, Shane
, American Statistical Association 2012 Melvin W. Carter Professorship 2010 Karl G. Maeser Excellence.) 2010 American Statistical Association Excellence in Statistics in Sports Award 2008 Howard Christensen Department of Statistics Honored Faculty Member 2001 Journal of the American Statistical Association
STAT 639V: Topics in Statistics Statistical Computing
Petris, Giovanni
methods. Most of the topics will be presented in the context of the R statistical computing language (see below). Computing: The computer language we will be using is R. The latest version of R is installed ... STAT 639V: Topics in Statistics, Fall 2014, Statistical Computing. General Information: Class hours
Multivariable analysis of spectral measurements for the characterization of semiconductor processes
White, David A. (David Allan), 1966-
2001-01-01T23:59:59.000Z
The availability of affordable and reliable optical sensor technology and the abundance of data that these sensors now provide have created new opportunities to better characterize and control semiconductor processes in ...
Tabchy, Adel B. (Adel Bassam), 1976-
2004-01-01T23:59:59.000Z
We have identified the genes responsible for SERCA2a-induced reversal of heart failure, the leading cause of morbidity and mortality in the United States and developed countries. We have previously shown that restoration ...
Journal of Multivariate Analysis 92 (2005) 186204 Covariate selection for semiparametric hazard
McKeague, Ian
partial likelihood--henceforth PPL [16], a backwards elimination covariate selection method [9], Bayesian model averaging [14,15], Bayesian variable selection [8], the lasso method for PPL [17], and nonconcave PPL [7]. Large sample
INDIAN STATISTICAL INSTITUTE Annual Report
Bandyopadhyay, Antar
Division: Stat-Math Unit (SMU), Kolkata; Stat-Math Unit (SMU), Delhi; Stat-Math Unit (SMU), Bangalore; Stat-Math Unit (SMU), Chennai; Applied Statistics Division; Applied Statistics Unit (ASU
STATISTICAL MECHANICS AND FIELD THEORY
Samuel, S.A.
2010-01-01T23:59:59.000Z
York. K. Bardakci, Field Theory for Solitons, II, Berkeley. Part I: Applications of Field Theory Methods to Statistical Mechanics. Statistical Mechanics to Field Theory. Chapter IV: The Grand
Business Statistics 207 Summer, 2013
Barrash, Warren
Business Statistics 207 Summer, 2013 Instructor: Phil Fry Office: MBEB 3249 e-mail: pfry) & by Appointment Textbook: Business Statistics: A Decision-Making Approach, 8th ed. by Groebner, Shannon, Fry: This is the first semester of a two semester course in business statistics. The objective of BUSSTAT 207
Fertilizer Statistics for Texas.
Fraps, G. S. (George Stronach)
1927-01-01T23:59:59.000Z
) H. B. PARKS, B.S., Apiculturist in Charge; A. H. ALEX, B.S., Queen Breeder. FEED CONTROL SERVICE: F. D. FULLER, M.S., Chief; S. D. PEARCE, Secretary; J. B. ROGERS, Feed Inspector; W. H. WOOD, Feed Inspector; K. L. KIRKLAND, B.S., Feed Inspector; W. ... and for selected counties. Estimated prices of plant food are given. The sales of fertilizer in the spring are partly related to the price of cotton in the preceding fall and winter. This is shown by statistical methods.
ARM - Historical Visitor Statistics
BS in STATISTICS: Statistical Science Emphasis (695220) MAP Sheet Department of Statistics
Seamons, Kent E.
the statistics list: C S 142 Introduction to Computer Programming Math 334 Ordinary Differential Equations Math Applied Time Series and Forecasting Stat 474 Theory of Interest Stat 496R Academic Internship: Statistics Stat 497R Introduction to Statistical Research Stat 535 Applied Linear Models Stat 536 Modern
Spatial Analysis Methods in Demography & Sustainability Science Rob Strawderman
Angenent, Lars T.
and geospatial analysis and associated statistical methods. A web portal on geospatial analysis and statistics ... Spatial Analysis Methods in Demography & Sustainability Science. Rob Strawderman, Department ... Rice Hall. The analysis of spatially explicit data is common to population research, especially
arXiv:1205.2064v1[astro-ph.IM]9May2012 Statistical Methods for Astronomy
Masci, Frank
in various fields of statistics, and second, the public domain R software system for statistical analysis. Together with its 3500 (and growing) add-on CRAN packages, R implements a vast range of statistical procedures in a coherent high-level language with advanced graphics. 1. Role and history of statistics
URBAN AND REGIONAL ANALYSIS: GEG 336 Gustavus Adolphus College
Fabrikant, Sara Irina
package that uses ArcView shapefiles and has mapping and statistical display and analysis capabilities
analysis issues relating: Topics by E-print Network
1 Intro to R R is a programming language and environment for statistical computing and graphics.
Guo, Zaoyang
Workshop 1. Intro to R. R is a programming language and environment for statistical computing and graphics. R provides a wide variety of statistical (linear and nonlinear modelling, classical statistical tests, time-series analysis, classification, clustering, ...) and graphical techniques. One of R
Application for SAS Certificate Advanced Statistics
Dahl, David B.
Application for SAS Certificate: Advanced Statistics. Student Signature _________ Date _________. I am a Statistics Major at Brigham Young University and authorize the Department of Statistics to access my academic information. Brigham Young University Department of Statistics offers
The University of Chicago Department of Statistics
The University of Chicago Department of Statistics Seminar Series. PETER HALL, Department of Mathematics and Statistics, University of Melbourne, Australia. Contemporary Frontiers in Statistics. THURSDAY ... and future directions of frontier problems in statistics. For further information and about building access
FROM STATISTICAL SIGNIFICANCE TO EFFECT ESTIMATION
Burgman, Mark
such as the American Psychological Association, and the absence of appropriate editorial pressure, statistical reform ... FROM STATISTICAL SIGNIFICANCE TO EFFECT ESTIMATION: STATISTICAL REFORM IN PSYCHOLOGY, MEDICINE ... ABSTRACT: Compelling criticisms of statistical significance testing (or Null Hypothesis Significance Testing
APS Operational Statistics for FY 2005
APS FY 2005 Operational Statistics. FY 2005 Year-to-Date Statistics. 2005 Statistics Summary (HTML or PDF). FY 2005 ...
Functional Data Analysis With Multi Layer Perceptrons
Fleuret, François
Analysis. We introduce a computation model for functional input data and we show that this model is a well ... Functional Data Analysis With Multi Layer Perceptrons. Fabrice Rossi, Brieuc Conan-Guez and Fran ... as classical multivariate data, because they are in general described by a finite set of input
Vortex methods and vortex statistics
Chorin, A.J.
1993-05-01T23:59:59.000Z
Vortex methods originated from the observation that in incompressible, inviscid, isentropic flow vorticity (or, more accurately, circulation) is a conserved quantity, as can be readily deduced from the absence of tangential stresses. Thus if the vorticity is known at time t = 0, one can deduce the flow at a later time by simply following it around. In this narrow context, a vortex method is a numerical method that makes use of this observation. Even more generally, the analysis of vortex methods leads to problems that are closely related to problems in quantum physics and field theory, as well as in harmonic analysis. A broad enough definition of vortex methods ends up by encompassing much of science. Even the purely computational aspects of vortex methods encompass a range of ideas for which vorticity may not be the best unifying theme. The author restricts himself in these lectures to a special class of numerical vortex methods: those that are based on a Lagrangian transport of vorticity in hydrodynamics by smoothed particles ("blobs") and those whose understanding contributes to the understanding of blob methods. Vortex methods for inviscid flow lead to systems of ordinary differential equations that can be readily clothed in Hamiltonian form, both in three and two space dimensions, and they can preserve exactly a number of invariants of the Euler equations, including topological invariants. Their viscous versions resemble Langevin equations. As a result, they provide a very useful cartoon of statistical hydrodynamics, i.e., of turbulence, one that can to some extent be analyzed analytically and, more importantly, explored numerically, with important implications also for superfluids, superconductors, and even polymers. In the author's view, vortex "blob" methods provide the most promising path to the understanding of these phenomena.
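The Lagrangian blob transport described in this abstract can be sketched for 2D flow: each blob carries a circulation and is advected by the smoothed Biot-Savart velocity induced by the others (a minimal illustration with a Krasny-style smoothing parameter delta and forward Euler time stepping; the function names and parameter values are illustrative assumptions, not from the lectures):

```python
import math

def blob_velocities(pos, gamma, delta=0.1):
    """Velocity induced at each blob by all others via the smoothed
    (Krasny-style) 2D Biot-Savart kernel with smoothing parameter delta."""
    n = len(pos)
    vel = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            r2 = dx * dx + dy * dy + delta * delta
            # A vortex of circulation gamma[j] rotates blob i around it.
            vel[i][0] += -gamma[j] * dy / (2.0 * math.pi * r2)
            vel[i][1] += gamma[j] * dx / (2.0 * math.pi * r2)
    return vel

def euler_step(pos, gamma, dt=0.01, delta=0.1):
    """One forward-Euler step of the Lagrangian blob dynamics."""
    vel = blob_velocities(pos, gamma, delta)
    return [[p[0] + dt * v[0], p[1] + dt * v[1]] for p, v in zip(pos, vel)]

# Two co-rotating blobs of equal circulation orbit their midpoint.
pos = [[-0.5, 0.0], [0.5, 0.0]]
gamma = [1.0, 1.0]
for _ in range(100):
    pos = euler_step(pos, gamma)
```

One of the conserved quantities the abstract alludes to is visible even in this crude scheme: because the pairwise kernel is antisymmetric, the circulation-weighted centroid (linear impulse) is preserved exactly at every step.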
Statistics as a dynamical attractor
Michail Zak
2012-08-30T23:59:59.000Z
It is demonstrated that any statistics can be represented by an attractor of the solution to a corresponding system of ODEs coupled with its Liouville equation. Such a non-Newtonian representation allows one to reduce the foundations of statistics to the better established foundations of ODEs. In addition, evolution to the attractor reveals possible micro-mechanisms driving random events to the final distribution of the corresponding statistical law. Special attention is concentrated upon the power law and its dynamical interpretation: it is demonstrated that the underlying dynamics supports the "violent reputation" of power law statistics.
Small Business Goals and Statistics
The Idaho National Laboratory (INL) is committed to supporting the small business objectives of the U.S. Government and the Department of Energy (DOE) and recognizes ...
Response to ISRP Comments Monitoring and Evaluation Statistical Support
to optimize spill and flow strategies to best benefit fish survival. 2. Evaluate the potential effects of alternative hydromanagement strategies on salmonid passage and survival. 3. Understand the effect and the development of statistical theory and models for proper analysis. Data analysis is not a primary component
Computer, Computational, and Statistical Sciences Division
Computer, Computational, and Statistical Sciences (CCS) Division: computational physics, computer science, applied mathematics, statistics, and the integration of ...
Damage detection in mechanical structures using extreme value statistics.
Worden, K.; Allen, D. W. (David W.); Sohn, H. (Hoon); Farrar, C. R. (Charles R.)
2002-01-01T23:59:59.000Z
The first and most important objective of any damage identification algorithm is to ascertain with confidence whether damage is present or not. Many methods have been proposed for damage detection based on ideas of novelty detection founded in pattern recognition and multivariate statistics. The philosophy of novelty detection is simple. Features are first extracted from a baseline system to be monitored, and subsequent data are then compared to see if the new features are outliers, which significantly depart from the rest of the population. In damage diagnosis problems, the assumption is that outliers are generated from a damaged condition of the monitored system. This damage classification necessitates the establishment of a decision boundary. Choosing this threshold value is often based on the assumption that the parent distribution of the data is Gaussian in nature. While the problem of novelty detection focuses attention on the outlier or extreme values of the data, i.e. those points in the tails of the distribution, threshold selection under the normality assumption weighs the central population of the data. Therefore, this normality assumption might impose potentially misleading behavior on damage classification, and is likely to lead the damage diagnosis astray. In this paper, extreme value statistics is integrated with novelty detection to specifically model the tails of the distribution of interest. Finally, the proposed technique is demonstrated on simulated numerical data and time series data measured from an eight degree-of-freedom spring-mass system.
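The tail-modeling idea in this abstract can be sketched as follows: fit an extreme value (Gumbel) distribution to block maxima of baseline novelty scores and take a high quantile of that fit as the decision threshold, instead of a mean-plus-3-sigma rule that reflects the central population (a simplified method-of-moments sketch, not the authors' implementation; the score definition and block size below are illustrative assumptions):

```python
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_threshold(scores, block=50, p=0.99):
    """Extreme-value damage-detection threshold: fit a Gumbel distribution
    to block maxima of baseline novelty scores by the method of moments
    (mean = mu + gamma*beta, var = pi^2 beta^2 / 6), then return its
    p-quantile as the decision boundary."""
    maxima = [max(scores[i:i + block]) for i in range(0, len(scores), block)]
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((m - mean) ** 2 for m in maxima) / n
    beta = math.sqrt(6.0 * var) / math.pi        # Gumbel scale
    mu = mean - EULER_GAMMA * beta               # Gumbel location
    return mu - beta * math.log(-math.log(p))    # Gumbel p-quantile

# Baseline (undamaged) novelty scores: absolute values of Gaussian noise.
rng = random.Random(1)
baseline = [abs(rng.gauss(0.0, 1.0)) for _ in range(5000)]
threshold = gumbel_threshold(baseline)
```

Because the Gumbel fit models only the block maxima, the resulting threshold sits well out in the tail, above the mean-plus-3-sigma boundary that a Gaussian fit to all baseline scores would give, which is the paper's point about weighting tails rather than the central population.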
Stine, Robert A.
Requirements for Statistics Concentration (6/9/11). For the Statistics concentration or major, ... credit units are required, with at least 3 credit units from Statistics. STAT 621 may contribute. The following courses offered by the Department of Statistics are eligible
International petroleum statistics report
NONE
1997-07-01T23:59:59.000Z
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report
NONE
1995-11-01T23:59:59.000Z
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
Models for Millions Department of Statistics
Stine, Robert A.
Models for Millions. Bob Stine, Department of Statistics, The Wharton School, University of Pennsylvania. Slide topics: Introduction; Statistics in the News; hot topics (Big Data, Business Analytics, Data Science); are the authors talking about statistics?
Emergent irreversibility and entanglement spectrum statistics
Claudio Chamon; Alioscia Hamma; Eduardo R. Mucciolo
2014-06-24T23:59:59.000Z
We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than a Hamiltonian one, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wave-function level and offers an alternative route to study quantum chaos and quantum integrability.
An analysis of sentencing for federal administration of justice offenses, 1991-1992
Carson, Kimberley S
1996-01-01T23:59:59.000Z
the fact, and felony misconduct or neglect of duty. Equality under the law and the goal of sentencing guidelines formally defined sentencing disparity linked to defendant characteristics as illegal. A multivariate analysis is conducted to test hypotheses...
Tripathi, Markandey M.; Krishnan, Sundar R.; Srinivasan, Kalyan K.; Yueh, Fang-Yu; Singh, Jagdish P.
2012-03-01T23:59:59.000Z
Chemiluminescence emissions from OH*, CH*, C2*, and CO2* formed within the reaction zone of premixed flames depend upon the fuel-air equivalence ratio in the burning mixture. In the present paper, a new partial least squares regression (PLS-R) based multivariate sensing methodology is investigated and compared with an OH*/CH* intensity ratio-based calibration model for sensing equivalence ratio in atmospheric methane-air premixed flames. Five replications of spectral data at nine different equivalence ratios ranging from 0.73 to 1.48 were used in the calibration of both models. During model development, the PLS-R model was initially validated with the calibration data set using the leave-one-out cross-validation technique. Since the PLS-R model used the entire raw spectral intensities, it did not need the nonlinear background subtraction of CO2* emission that is required for typical OH*/CH* intensity ratio calibrations. An unbiased spectral data set (not used in the PLS-R model development) for 28 different equivalence ratio conditions ranging from 0.71 to 1.67 was used to predict equivalence ratios using the PLS-R and the intensity ratio calibration models. It was found that the equivalence ratios predicted with the PLS-R based multivariate calibration model matched the experimentally measured equivalence ratios within 7%, whereas the OH*/CH* intensity ratio calibration grossly underpredicted equivalence ratios in comparison to measured equivalence ratios, especially under rich conditions (equivalence ratio > 1.2). The practical implications of the chemiluminescence-based multivariate equivalence ratio sensing methodology are also discussed.
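The calibration scheme described, single-response PLS with leave-one-out cross-validation, can be sketched on synthetic spectra. Everything below is an assumption for illustration (the NIPALS implementation, the 50-channel signal shapes, and the noise level are invented; this is not the paper's data or code):

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Single-response PLS regression via NIPALS (generic sketch)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w = w / np.linalg.norm(w)       # weight vector
        t = Xc @ w                      # scores
        tt = float(t @ t)
        p = Xc.T @ t / tt               # X loadings
        qk = float(yc @ t) / tt         # y loading
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - qk * t                # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression coefficients
    return x_mean, y_mean, B

def pls1_predict(model, X):
    x_mean, y_mean, B = model
    return y_mean + (X - x_mean) @ B

# Synthetic "spectra": 9 equivalence ratios, 50 spectral channels
rng = np.random.default_rng(1)
phi = np.linspace(0.73, 1.48, 9)
basis = rng.normal(size=(3, 50))                      # hypothetical band shapes
C = np.column_stack([phi, phi ** 2, np.ones_like(phi)])
X = C @ basis + 0.005 * rng.normal(size=(9, 50))      # small measurement noise

# Leave-one-out cross-validation, mirroring the calibration step
errs = []
for i in range(len(phi)):
    mask = np.arange(len(phi)) != i
    model = pls1_fit(X[mask], phi[mask], n_components=3)
    errs.append(abs(float(pls1_predict(model, X[i:i + 1])[0]) - phi[i]))
```

Because the model regresses on the full spectrum rather than a two-band ratio, no channel-specific background subtraction is needed, which is the advantage the abstract highlights.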
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
Probability, Statistics, and Stochastic Processes. Peter Olofsson. A Wiley-Interscience Publication. The author had been teaching a course on calculus-based probability and statistics, mainly for mathematics, science, and engineering students. Other than the basic probability theory, the goal was to include ...
Key China Energy Statistics 2012
Levine, Mark
2013-01-01T23:59:59.000Z
Total crude oil imports: 239 Mt. Appendix 3, Energy Balance/China 2010: crude oil, urban consumption, and statistical-difference entries (in Mtce and physical quantities).
Practical Statistics for the LHC
Cranmer, Kyle
2015-01-01T23:59:59.000Z
This document is a pedagogical introduction to statistics for particle physics. Emphasis is placed on the terminology, concepts, and methods being used at the Large Hadron Collider. The document addresses both the statistical tests applied to a model of the data and the modeling itself.
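One of the standard statistical tests covered in such introductions is the significance of a Poisson counting experiment with known expected background b. A stdlib sketch of the textbook asymptotic likelihood-ratio formula (a generic illustration, not an excerpt from this document):

```python
import math

def counting_significance(n_obs, b):
    """Asymptotic discovery significance Z for a Poisson counting
    experiment: observed count n_obs against known expected
    background b, via the likelihood-ratio formula
    Z = sqrt(2 * (n_obs * ln(n_obs / b) - (n_obs - b)))."""
    if n_obs <= b:
        return 0.0
    return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))
```

For example, observing 10 events over an expected background of 5 gives a significance just under 2 sigma, noticeably less than the naive (10 - 5) / sqrt(5) estimate.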
A STATISTICAL TEST SUITE FOR RANDOM AND PSEUDORANDOM NUMBER GENERATORS
December 2000. NIST Special Publication (SP) 800-22, A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications, covers statistical testing of random number and pseudorandom number generators (RNGs and PRNGs) that may be used for many purposes, including challenges in authentication protocols.
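The simplest test in SP 800-22 is the frequency (monobit) test: the proportions of zeros and ones in a random sequence should be approximately equal. A stdlib-only sketch:

```python
import math

def monobit_p_value(bits):
    """Frequency (monobit) test from NIST SP 800-22.

    Maps bits to +/-1, sums them, and returns the p-value
    erfc(|S| / sqrt(2n)); a p-value below the chosen significance
    level (SP 800-22 uses 0.01) flags the sequence as non-random.
    """
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))
```

A perfectly balanced sequence yields p = 1.0, while a constant sequence yields a p-value that is essentially zero.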
STATISTICAL FILTERING. John B. Moore
Moore, John Barratt
Theoretical developments in statistical filtering have been made side by side with developments in classical filtering, and linkages will be made between these two major approaches. In the classical approaches, the wanted signals lie in one frequency band and the unwanted signals (noise) lie in another, with possibly some overlap. In the statistical approach, the best ...
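As a concrete instance of the statistical approach to filtering (a minimal illustration, not this paper's formulation), a scalar Kalman filter estimating a slowly varying level from noisy measurements:

```python
def scalar_kalman(measurements, process_var, meas_var, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter for a random-walk state.

    x is the state estimate, p its variance; each step predicts
    (p grows by process_var) then updates with the measurement z,
    weighted by the Kalman gain k = p / (p + meas_var).
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + process_var          # predict
        k = p / (p + meas_var)       # Kalman gain
        x = x + k * (z - x)          # measurement update
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noise-free constant level 5.0: the estimate converges toward it
est = scalar_kalman([5.0] * 30, process_var=1e-4, meas_var=0.25)
```

Unlike a classical band-pass filter, the gain here is derived from the assumed statistics (variances) of signal and noise rather than from their frequency bands.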
Building statistical models by visualization
Minka,Tom
Building statistical models by visualization. Tom Minka, CMU Statistics Dept. Recommended books: "The Elements of Graphing Data", William Cleveland, 2nd Ed.; "Visualizing Data", William Cleveland. Outline: scatterplots for unpaired data; quantile of x = fraction of points ...
Inference for Clustered Mixed Outcomes from a Multivariate Generalized Linear Mixed Model
Chen, Hsiang-Chun
2013-08-01T23:59:59.000Z
Replacing E(delta_i1t) and E(delta_i2t') with their marginal expectations over X, mu1 = E_X{E(delta_i1t)} and mu2 = E_X{E(delta_i2t')}, which are shown in the previous subsections, the overall total-CC is rho_total = K_total_N,1,2(mu1, mu2) / K_total_D,1,2(mu1, mu2). Contents excerpt: 2.2 Multivariate Generalized Linear Mixed Model; 2.3 Assessing Correlation in the Generalized Linear Mixed Model; 2.4 Bayesian Method for the Generalized Linear Mixed Model; 3. Assessing Correlation ...
ENTROPY: A COUNTERPART IN STATISTICAL ENERGY ANALYSIS. A. Le Bot, A. Carbonelli, J. Perret-Liaudet
Paris-Sud XI, Université de
18th International Congress on Sound and Vibration (ICSV18), 10-14 July 2011, Rio de Janeiro, Brazil. Statistical energy analysis (SEA) is the best-known method for the prediction of sound and vibration levels; this paper introduces entropy as a counterpart to the energy level.
Olsen Jr., Dan R.
Department of Statistics. For students entering the degree program during the 2014-2015 curricular year. University Core courses include: Introduction to R Programming (Stat 124); SAS Base Programming Skills (Stat 223); Applied R Programming (Stat 224); ... for Modeling (Stat 497R); Introduction to Statistical Research (Stat 500); Business Career Essentials (Stat 538).
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai, Alex; Banta, Larry; Tucker, D.A.; Gemmen, R.S.
2008-06-01T23:59:59.000Z
This paper presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The facility provides for the testing and simulation of different fuel cell models that in turn help identify the key issues encountered in the transient operation of such systems. An empirical model of the facility consisting of a simulated fuel cell cathode volume and balance of plant components is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in Transfer Function format. A multivariate system is then built from individual transfer functions, creating a matrix that serves as the nominal plant in an H-Infinity robust control algorithm. The controller’s main objective is to track and maintain hybrid operational constraints in the fuel cell’s cathode airflow, and the turbo machinery states of temperature and speed, under transient disturbances. This algorithm is then tested on a Simulink/MatLab platform for various perturbations of load and fuel cell heat effluence.
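The identification step described above, deriving transfer functions from frequency-response (Bode) data, can be illustrated with a toy single-channel fit. The first-order plant, its gain and time constant, and the noise-free data below are all assumptions for illustration, not the paper's 300 kW/120 kW facility model:

```python
import numpy as np

# Hypothetical frequency-response data for a first-order plant
K_true, tau_true = 2.0, 0.5
w = np.logspace(-1, 2, 40)                  # frequencies, rad/s
G = K_true / (1.0 + 1j * w * tau_true)      # "measured" G(jw)

# Rearranging G * (1 + jw*tau) = K makes the fit linear in (K, tau):
#   real part:  Re(G) = K + tau * w * Im(G)
#   imag part:  Im(G) = -tau * w * Re(G)
A = np.vstack([
    np.column_stack([np.ones_like(w), w * G.imag]),
    np.column_stack([np.zeros_like(w), -w * G.real]),
])
b = np.concatenate([G.real, G.imag])
(K_est, tau_est), *_ = np.linalg.lstsq(A, b, rcond=None)
```

In the MIMO setting of the paper, one such transfer function is identified per input/output pair and the results are assembled into the matrix that serves as the nominal plant for the H-Infinity synthesis.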
DATA MONITORING AND ANALYSIS PROGRAM MANUAL
Gravois, Melanie
2007-01-01T23:59:59.000Z
Control charts and analysis guidelines can be found in common textbooks and guidebooks on Statistical Process Control (SPC).
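A minimal sketch of the control-chart limits such SPC references describe: an individuals chart with a center line at the mean and control limits at plus/minus three standard deviations. Using the sample standard deviation here is a simplification; SPC texts usually estimate sigma from the average moving range:

```python
def shewhart_limits(values):
    """Individuals control chart: returns (LCL, center line, UCL),
    with limits at mean +/- 3 * sample standard deviation."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean, mean + 3 * sd
```

Points falling outside the limits, or systematic runs on one side of the center line, signal that the process is out of statistical control.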
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1942. By A. W. Anderson and E. A. Power. Statistical Digest No. 11. U.S. Government Printing Office, Washington 25, D.C. Price 60 cents.
Statistics in Linguistics Tutorial Just a sip. . .
Fong, Sandiway
Statistics in Linguistics Tutorial: Just a sip... Mike Hammond, Linguistics, U. of Arizona. Overview: Are our data categorical? Typological claims ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1964. Statistical Digest No. 58. Bureau of Commercial Fisheries, Donald L. McKernan, Director. U.S. Government Printing Office, Washington, D.C. 20402. Price $2.50 (paper cover).
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1963. Statistical Digest No. 57. Bureau of Commercial Fisheries, Donald L. McKernan, Director. Government Printing Office, Washington, D.C. 20402. Price $2.25 (paper cover).
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1962. Statistical Digest No. 56. Bureau of Commercial Fisheries, Donald L. McKernan, Director. U.S. Government Printing Office, Washington, D.C. 20402. Price $2.25 (paper cover).
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1961. Statistical Digest No. 54. Donald L. McKernan, Director. By E. A. ... Government Printing Office, Washington, D.C. 20402. Price $2 (paper cover). Fishery statistics of the United States are compiled ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1944. Statistical Digest No. 16. Fish and Wildlife Service, Albert M. Day, Director. By A. W. ... Fishery Statistics of the United States and Alaska are compiled and published annually to make available ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1965. Statistical Digest No. 59. Bureau of Commercial Fisheries, H. E. Crowther, Director. U.S. Government Printing Office, Washington, D.C. 20402. Price $4 (paper cover).
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1953. Statistical Digest No. 36. John L. Farley, Director. By A. W. ... Fishery Statistics of the United States and Alaska are compiled and published annually to make ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1950. Statistical Digest No. 27. Fish and Wildlife Service, John L. Farley, Director. Washington 25, D.C. Price $2.00 (paper). Fishery Statistics of the United States and Alaska are compiled ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States 1943. Statistical Digest No. 14. Fish and Wildlife Service, Albert M. Day, Director. Price 75 cents. Fishery Statistics of the United States and Alaska are compiled and published ...
Quantum particles from classical statistics
C. Wetterich
2010-02-11T23:59:59.000Z
Quantum particles and classical particles are described in a common setting of classical statistical physics. The property of a particle being "classical" or "quantum" ceases to be a basic conceptual difference. The dynamics differs, however, between quantum and classical particles. We describe position, motion and correlations of a quantum particle in terms of observables in a classical statistical ensemble. On the other side, we also construct explicitly the quantum formalism with wave function and Hamiltonian for classical particles. For a suitable time evolution of the classical probabilities and a suitable choice of observables all features of a quantum particle in a potential can be derived from classical statistics, including interference and tunneling. Besides conceptual advances, the treatment of classical and quantum particles in a common formalism could lead to interesting cross-fertilization between classical statistics and quantum physics.
Statistical mechanics of gene competition
Venegas-Ortiz, Juan; Ortiz, Juan Venegas
2013-11-28T23:59:59.000Z
Statistical mechanics has been applied to a wide range of systems in physics, biology, medicine and even anthropology. This theory has been recently used to model the complex biochemical processes of gene expression and ...
Forensic Statistics: Ready for consumption?
Gill, Richard D.
... in the courtroom. Stator. http://www.kennislink.nl/web/show?id=111865. Everyday statistics: intensive two-way interaction between statistician and subject-matter expert (client); a cyclic process of re ...
Anyonic statistics with continuous variables
Jing Zhang; Changde Xie; Kunchi Peng; Peter van Loock
2008-10-30T23:59:59.000Z
We describe a continuous-variable scheme for simulating the Kitaev lattice model and for detecting statistics of abelian anyons. The corresponding quantum optical implementation is solely based upon Gaussian resource states and Gaussian operations, hence allowing for a highly efficient creation, manipulation, and detection of anyons. This approach extends our understanding of the control and application of anyons and it leads to the possibility for experimental proof-of-principle demonstrations of anyonic statistics using continuous-variable systems.
Issues in International Energy Consumption Analysis: Electricity...
U.S. Energy Information Administration (EIA) Indexed Site
Issues in International Energy Consumption Analysis: Electricity Usage in India's Housing Sector. November 2014. Independent Statistics & Analysis, www.eia.gov. U.S. Department of ...
Faculty of Science Mathematics and Statistics
Mathematicians and statisticians use powerful mathematical, statistical, and computational tools to solve both important theoretical problems and practical ones. Students can pursue programs in mathematics, statistics, actuarial science, and math and computer science.
Gary Christopher Vezzoli
2001-04-04T23:59:59.000Z
This work presents physical consequences of our theory of induced gravity (Ref.1) regarding: 1) the requirement to consider shape and materials properties when calculating graviton cross section collision area; 2) use of Special Relativity; 3) implications regarding the shape of cosmos; 4) comparison to explanations using General Relativity; 5) properties of black holes; 6) relationship to the strong force and the theorized Higgs boson; 7) the possible origin of magnetic attraction; 8) new measurements showing variation from gravitational inverse square behavior at length scales of 0.1 mm and relationship to the Cosmological constant, and proof of the statistical time properties of the gravitational interaction.