Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Sciences: Applying statistical reasoning and rigor to multidisciplinary scientific investigations. Contact: Group Leader Joanne Wendelberger; Deputy Group Leader James R. Gattiker; Group Administrator LeeAnn Martinez, (505) 667-3308. Statistical Sciences provides statistical reasoning and rigor to multidisciplinary scientific investigations and the development, application, and communication of cutting-edge statistical sciences research...
Usage Statistics: Genepool Cluster Statistics. Period: daily, weekly, monthly, quarterly, yearly, 2-year. Utilization By Group; Jobs Pending. Last edited: 2013-09-26 18:21:13...
Downtime Log Yearly Operation Statistics 2016 Statistics 2015 Statistics 2014 Statistics 2013 Statistics 2012 Statistics 2011 Statistics 2010 Statistics 2009 Statistics 2008...
Cluster Statistics: Ganglia can be used to monitor performance of PDSF nodes... PDSF IO Monitoring: This page shows the IO response of the elizas and...
Genepool Memory Heatmaps Usage Statistics UGE Scheduler Cycle Time File storage and I/O Data Management Supported Systems FAQ Performance and Optimization Genepool Completed Jobs Genepool Training and Tutorials Websites, databases and cluster services Testbeds Retired Systems Storage & File Systems Data & Analytics Connecting to NERSC Queues and Scheduling Job Logs & Statistics Application Performance Training & Tutorials Software Policies User Surveys NERSC Users Group User
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Columns: FY 2011 Current Approp., FY 2012 Enacted Approp., FY 2013 Congressional Request, $ change, % change. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies, Appropriation Summary, Energy Programs: Energy efficiency and renewable energy: 1,771,721; 1,809,638; 2,337,000; +527,362; +29.1%. Electricity delivery and energy reliability: ...
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Columns: FY 2007 Current Op. Plan, FY 2008 Current Approp., FY 2009 Congressional Request, $ change, % change. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies, Appropriation Summary, Energy Programs: Energy efficiency and renewable energy: --; 1,722,407; 1,255,393; -467,014; -27.1%. Electricity delivery and energy reliability: --; 138,556; 134,000; -4,556; -3.3%. Nuclear
ARM Facility Statistics 2015 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Historical Statistics Field Campaigns Operational...
Prohaska, R.; Duran, A.; Ragatz, A.; Kelly, K.
2015-05-03
With funding from the U.S. Department of Energy’s Vehicle Technologies Office, the National Renewable Energy Laboratory (NREL) conducts real-world performance evaluations of advanced medium- and heavy-duty fleet vehicles. Evaluation results can help vehicle manufacturers fine-tune their designs and assist fleet managers in selecting fuel-efficient, low-emission vehicles that meet their economic and operational goals. In 2011, NREL launched a large-scale performance evaluation of medium-duty electric vehicles. With support from vehicle manufacturers Smith and Navistar, NREL research focused on characterizing vehicle operation and drive cycles for electric delivery vehicles operating in commercial service across the nation.
Broader source: Energy.gov [DOE]
This page provides EERE Web statistics for all office and corporate websites that opted to use EERE's analytics account. Webtrends statistics for Fiscal Year 2009 (FY09) to FY11 are available for...
Office of Survey Development and Statistical Integration
Gasoline and Diesel Fuel Update (EIA)
Steve Harvey, April 27, 2011, Washington, D.C. Tough Choices in U.S. EIA's Data Programs. Agenda: Office of Oil, Gas, and Coal Supply Statistics; Office of Petroleum and Biofuels Statistics; Office of Electricity, Renewables, and Uranium Statistics; Office of Energy Consumption and Efficiency Statistics; Office of Survey Development and Statistical Integration. Coal Data Collection Program, James Kendell, Washington, DC, April 27,
ARM - Historical Operational Statistics
Operational Statistics 2016 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Past Quarterly Reports Historical Statistics Field Campaigns Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Operational Statistics The reporting requirements for DOE national user facilities are based on time. These requirements concern the actual hours of operation (ACTUAL) and the established maximum operation or uptime
ARM - Historical Visitor Statistics
Visitor Statistics 2016 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Past Quarterly Reports Historical Statistics Field Campaigns Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Visitor Statistics As a national user facility, ARM is required to report facility use for actual visitors and for active user research computer and Archive accounts; only unique scientific users are counted.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report April - June 2014 This report was...
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Table of Contents: Summary; Mandatory Funding; Energy Supply; Non-Defense site acceleration
International Energy Statistics - EIA
International > International Energy Statistics International Energy Statistics Petroleum Production | Annual Monthly/Quarterly Consumption | Annual Monthly/Quarterly Capacity | Bunker Fuels | Stocks | Annual Monthly/Quarterly Reserves | Imports | Annual Monthly/Quarterly Exports | CO2 Emissions | Heat Content Natural Gas All Flows | Production | Consumption | Reserves | Imports | Exports | Carbon Dioxide Emissions | Heat Content Coal All Flows | Production | Consumption | Reserves | Imports
AMERICAN STATISTICAL ASSOCIATION
AMERICAN STATISTICAL ASSOCIATION, COMMITTEE ON ENERGY STATISTICS, FALL MEETING, FRIDAY, OCTOBER 17, 2003. The Committee met in Room 8E089 in the Forrestal Building, 1000 Independence Avenue, S.W., Washington, D.C., at 8:30 a.m., Jay Breidt, Chair, presiding. PRESENT: F. Jay Breidt, Chair; Nicolas Hengartner, Vice Chair; Johnny Blair, Committee Member; Mark Burton, Committee Member; Jae Edmonds, Committee Member; Moshe Feder, Committee Member; James K. Hammitt, Committee
April 2015 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Annual Coal Distribution Report 2013 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as
Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Report (Abbreviated) July - September 2015 January 2016 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore
March 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report October - December 2014 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should
Candidate Assembly Statistical Evaluation
Energy Science and Technology Software Center (OSTI)
1998-07-15
The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels, and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted, it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Columns: FY 2004 Comparable Approp., FY 2005 Comparable Approp., FY 2006 Request to Congress, FY 2006 vs. FY 2005. Discretionary Summary By Appropriation, Energy And Water Development, Appropriation Summary, Energy Programs, Energy supply: Operation and maintenance: 787,941; 909,903; 862,499; -47,404; -5.2%. Construction: 6,956
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Columns: FY 2005 Current Approp., FY 2006 Current Approp., FY 2007 Congressional Request, $ change, % change. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies, Appropriation Summary, Energy Programs, Energy supply and conservation: Operation and maintenance: 1,779,399; 1,791,372; 1,917,331; +125,959; +7.0%
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). Columns: FY 2006 Current Approp., FY 2007 Congressional Request, FY 2008 Congressional Request, $ change, % change. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies, Appropriation Summary, Energy Programs, Energy supply and conservation: Operation and maintenance: 1,781,242; 1,917,331; 2,187,943; +270,612; +14.1%
Implementing Bayesian Statistics
Implementing Bayesian Statistics and a Full Systematic Uncertainty Propagation with the Soft X-Ray Tomography Diagnostic on the Madison Symmetric Torus, by Jay Johnson. A thesis submitted in partial fulfillment of the requirements for the degree of Bachelor of Science (Physics) at the University of Wisconsin - Madison, 2013. Abstract: The Madison Symmetric Torus uses multiple diagnostics to measure electron temperature (Te). The soft x-ray (SXR) diagnostic measures Te from x-ray emission in
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These includes both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.
International petroleum statistics report
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Sandia Energy - Statistical Mechanics with Density Functional...
Statistical Mechanics with Density Functional Theory Accuracy. Home > Highlights - HPC. Statistical...
Statistical physics ""Beyond equilibrium
Ecke, Robert E
2009-01-01
The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.
statistics. Submitted by Rmckeel, Contributor, 8 November 2012, 12:58. OpenEI dashboard, Google Analytics, mediawiki, OpenEI statistics, wiki, OpenEI web...
FY 2015 Statistical Table by Appropriation
Statistical Table by Appropriation (dollars in thousands, OMB Scoring). FY 2015 Congressional Request. Columns: FY 2013 Current Approp., FY 2014 Enacted Approp., FY 2014 Adjustment, FY 2014 Current Approp., FY 2015 Congressional Request. Discretionary Summary By Appropriation, Energy And Water Development And Related Agencies, Appropriation Summary, Energy Programs: Energy efficiency and renewable energy: 1,691,757; 1,900,641; ----; 1,900,641
International petroleum statistics report
1995-07-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1995-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1996-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report
1996-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.
International petroleum statistics report
1997-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
ARM - Historical Field Campaign Statistics
Field Campaign Statistics 2016 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Past Quarterly Reports Historical Statistics Field Campaigns Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Field Campaign Statistics ARM Climate Research Facility users regularly conduct field campaigns to augment routine data acquisitions and to test and validate new instruments. Since the ARM Program began operations
Moore named an American Statistical Association Fellow
Moore named an American Statistical Association Fellow. The ASA inducted Leslie (Lisa) Moore as a Fellow at the 2014 Joint Statistical...
Computing contingency statistics in parallel.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-09-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics, where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speedup and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
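The map-reduce pattern the abstract describes can be sketched in a few lines (a hypothetical illustration, not the paper's open-source implementation): each processor tallies a local contingency table over the (x, y) pairs it sees, tables merge by adding counts, and derived statistics such as the χ² independence statistic follow from the merged counts.

```python
from collections import Counter
from functools import reduce

def local_table(pairs):
    # Map step: each processor tallies (x, y) co-occurrences it sees locally.
    return Counter(pairs)

def merge(t1, t2):
    # Reduce step: contingency tables merge by simple addition of counts.
    return t1 + t2

def chi_square(table):
    # Chi-square independence statistic from the merged joint counts.
    n = sum(table.values())
    row, col = Counter(), Counter()
    for (x, y), c in table.items():
        row[x] += c
        col[y] += c
    stat = 0.0
    for x in row:
        for y in col:
            expected = row[x] * col[y] / n
            observed = table.get((x, y), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

# Two "processors" observe disjoint chunks of the data.
chunk_a = [("a", 0), ("a", 0), ("b", 1)]
chunk_b = [("b", 1), ("a", 1), ("b", 0)]
merged = reduce(merge, [local_table(chunk_a), local_table(chunk_b)])
print(sum(merged.values()))  # 6 observations total
```

Note that, as the abstract points out, the merged table itself must travel between processors, so communication cost grows with the number of distinct (x, y) categories, unlike fixed-size moment statistics.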
Key China Energy Statistics 2011
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-01-15
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Key China Energy Statistics 2012
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-05-01
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Web Analytics and Statistics | Department of Energy
User Experience Research & Statistics » Web Analytics and Statistics. EERE uses Google Analytics to capture statistics on its websites. These statistics help website managers measure and report on users, sessions, most visited pages, and more. The Web Template Coordinator can provide you with EERE's username and password and answer questions about your site statistics. Adding Google Analytics to EERE Websites: In order for Google Analytics to capture statistics on
QUANTUM MECHANICS WITHOUT STATISTICAL POSTULATES
Geiger, G.; et al.
2000-11-01
The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way without a reduction postulate. Due to the chaotic motion of the hidden classical particle, all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single-system theory.
Ideas for Effective Communication of Statistical Results
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Anderson-Cook, Christine M.
2015-03-01
Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.
ORISE: Statistical Analyses of Worker Health
appropriate methods of statistical analysis to a variety of problems in occupational health and other areas. Our expertise spans a range of capabilities essential for statistical...
VTPI-Transportation Statistics | Open Energy Information
Area: Transportation. Resource Type: Dataset. Website: www.vtpi.org/tdm/tdm80.htm. Cost: Free. References: VTPI-Transportation Statistics...
IEA Energy Statistics | Open Energy Information
IEA Energy Statistics. Tool Summary: Name: IEA Energy Statistics. Agency/Company/Organization: International Energy Agency. Sector: Energy. Topics: GHG...
Statistically significant relational data mining :
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Moore honored with American Statistical Association award
Moore honored with American Statistical Association award. Lisa Moore is the recipient of the 2013 Don Owen Award presented by the American Statistical Association, San Antonio Chapter. May 24, 2013. The American Statistical Association (ASA) is the world's largest community of statisticians. It was founded in Massachusetts in 1839. Leslie "Lisa" Moore of the Laboratory's Statistical
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
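As a toy illustration of transmitted variation (the response function here is an assumption for the example, not one from the presentation), input variation propagated by Monte Carlo through a nonlinear response agrees closely with the first-order delta-method prediction sd_y ≈ |f'(mu_x)| · sd_x:

```python
import math
import random

random.seed(0)

# Hypothetical response function; its slope at the operating point
# determines how much input variation is transmitted to the output.
def f(x):
    return math.exp(0.5 * x)

mu_x, sd_x = 2.0, 0.1
xs = [random.gauss(mu_x, sd_x) for _ in range(100_000)]  # input variation
ys = [f(x) for x in xs]                                  # transmitted variation

mean_y = sum(ys) / len(ys)
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / len(ys))

# First-order propagation with f'(x) = 0.5 * exp(0.5 * x):
predicted = 0.5 * math.exp(0.5 * mu_x) * sd_x
print(sd_y, predicted)
```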
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns?
How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
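The dice example admits a compact contrast between the forward (probability) and inverse (statistics) problems; a minimal sketch, with the sample size chosen arbitrarily for illustration:

```python
from collections import Counter
from fractions import Fraction
import itertools
import random

# Forward problem (probability): with two fair dice, the chance of each
# sum follows from complete a priori knowledge of the experiment.
p = Counter()
for a, b in itertools.product(range(1, 7), repeat=2):
    p[a + b] += Fraction(1, 36)
print(p[7])  # exactly 1/6

# Inverse problem (statistics): given observed rolls, infer the face
# probabilities; the estimates carry sampling error.
random.seed(1)
rolls = [random.randint(1, 6) for _ in range(60_000)]
est = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
print(est)
```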
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
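The auto-correlative statistic itself is easy to state. The plain-Python sketch below computes the lag-k sample autocorrelation; the VTK engines are C++ and parallel, so this shows only the underlying formula, not the engine's API:

```python
import math

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation r(k) = c(k) / c(0) for k = 0..max_lag."""
    n = len(x)
    mu = sum(x) / n
    c0 = sum((v - mu) ** 2 for v in x) / n
    r = []
    for k in range(max_lag + 1):
        ck = sum((x[i] - mu) * (x[i + k] - mu) for i in range(n - k)) / n
        r.append(ck / c0)
    return r

signal = [math.sin(0.1 * i) for i in range(500)]
r = autocorrelation(signal, 5)
print(r)  # r[0] is 1.0 by construction; slow oscillation keeps r[1] near 1
```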
Tomography and weak lensing statistics
Munshi, Dipak; Coles, Peter; Kilbinger, Martin E-mail: peter.coles@astro.cf.ac.uk
2014-04-01
We provide generic predictions for the lower order cumulants of weak lensing maps, and their correlators, for tomographic bins as well as in three dimensions (3D). Using the small-angle approximation, we derive the corresponding one- and two-point probability distribution functions for the tomographic maps from different bins and for 3D convergence maps. The modelling of weak lensing statistics is obtained by adopting a detailed prescription for the underlying density contrast that involves the hierarchical ansatz and the lognormal distribution. We study the dependence of our results on cosmological parameters and on source distributions corresponding to realistic surveys such as LSST and DES. We briefly outline how photometric redshift information can be incorporated in our results. We also show how topological properties of convergence maps can be quantified using our results.
STORM: A STatistical Object Representation Model
Rafanelli, M.; Shoshani, A.
1989-11-01
In this paper we explore the structure and semantic properties of the entities stored in statistical databases. We call such entities "statistical objects" (SOs) and propose a new "statistical object representation model," based on a graph representation. We identify a number of SO representational problems in current models and propose a methodology for their solution. 11 refs.
Quantum chaos and statistical nuclear physics
Not Available
1986-01-01
This book contains 33 selections. Some of the titles are: Chaotic motion and statistical nuclear theory; Test of spectrum and strength fluctuations with proton resonances; Nuclear level densities and level spacing distributions; Spectral statistics of scale invariant systems; and Antiunitary symmetries and energy level statistics.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
I/O Statistics Last 30 Days These plots show the daily statistics for the last 30 days for the storage systems at NERSC in terms of the amount of data transferred and the number of files transferred. Daily I/O Volume Daily I/O Count
Energy Statistics, Third Quarter, 1991
Not Available
1991-01-01
Data are presented on 104 tables on the following energy sources: petroleum, coal, natural gas, biomass, and electric power by hydroelectric, nuclear, and renewable sources (wood, waste, geothermal, wind, photovoltaic, and solar thermal). Tables include information on imports, energy production, energy consumption, prices, well drilling, seismic survey activity, pipeline mileage, reserves, energy supplies, storage facilities, exports, residential sector use, regional analyses, and business indicators (e.g., price indexes, balance of trade, exchange rates). Gas liquids, petroleum products, and peat information is included.
Statistics for characterizing data on the periphery
Theiler, James P; Hush, Donald R
2010-01-01
We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
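As a hedged illustration of the ellipsoidal viewpoint (this uses the standard Mahalanobis distance with the sample covariance, i.e., the Gaussian baseline the abstract mentions, not the paper's proposed alternatives), a point on the periphery scores far above the bulk of the data:

```python
import random

random.seed(2)

# 2-D Gaussian cloud plus one anomalous point far from the bulk.
data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2000)]
anomaly = (6.0, 6.0)

def mahalanobis_sq(pt, data):
    """Squared Mahalanobis distance of pt from the sample mean, using the
    2x2 sample covariance inverted in closed form."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / n
    syy = sum((p[1] - my) ** 2 for p in data) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    det = sxx * syy - sxy * sxy
    dx, dy = pt[0] - mx, pt[1] - my
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

scores = [mahalanobis_sq(p, data) for p in data]
print(mahalanobis_sq(anomaly, data), max(scores))
```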
Statistical assessment of Monte Carlo distributional tallies
Kiedrowski, Brian C; Solomon, Clell J
2010-12-09
Four tests are developed to assess the statistical reliability of distributional or mesh tallies. To this end, the relative variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality.
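A minimal sketch of the kind of per-bin reliability question involved, using a toy tally and the standard relative-error estimator R² = Σx²/(Σx)² − 1/N rather than the report's four tests (which are not reproduced here):

```python
import random

random.seed(3)

# Toy "mesh tally": each history deposits an exponentially distributed
# score in one of 10 spatial bins.
bins, n_hist = 10, 50_000
s1 = [0.0] * bins  # per-bin sum of scores
s2 = [0.0] * bins  # per-bin sum of squared scores
for _ in range(n_hist):
    b = random.randrange(bins)
    x = random.expovariate(1.0)
    s1[b] += x
    s2[b] += x * x

# Standard relative-error estimate of each bin's mean; a distributional
# tally is well converged only when every bin's relative error is small.
rel_err = [(s2[k] / s1[k] ** 2 - 1.0 / n_hist) ** 0.5 for k in range(bins)]
print(rel_err)
```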
Moore honored with American Statistical Association award
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Before his death in 1991, Professor Owen was the Distinguished Professor of Statistics at Southern Methodist University in Dallas, Texas. His illustrious career serves as the ...
Statistics for Industry Groups and Industries, 2003
2009-01-18
Statistics for the U.S. Department of Commerce including types of manufacturing, employees, and products as outlined in the Annual Survey of Manufacturers (ASM).
Statistical methods for nuclear material management
Bowen W.M.; Bennett, C.A.
1988-12-01
This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE...
Office of Scientific and Technical Information (OSTI)
Title: STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014 Authors: Harris, S. ; Gross, R. ; Watson, H. Publication ...
Federal offshore statistics: leasing - exploration - production - revenue
Essertier, E.P.
1984-01-01
Federal Offshore Statistics is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the Federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Statistics are presented under the following topics: (1) highlights, (2) leasing, (3) exploration and development, (4) production and revenue, (5) federal offshore production by ranking operator, 1983, (6) reserves and undiscovered recoverable resources, and (7) oil pollution in the world's oceans.
Topology for statistical modeling of petascale data.
Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice
2011-07-01
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
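The comparison workflow can be sketched with three simple criteria (mean level, variability, and mean step-to-step ramp); these are illustrative choices standing in for the report's own three statistics, and the clear-sky-plus-noise profile below is synthetic, not the Las Vegas or Albuquerque data:

```python
import math
import random

random.seed(4)

def series_stats(x):
    """Three summary criteria for an irradiance time series:
    mean level, standard deviation, and mean absolute ramp."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    ramps = [abs(x[i + 1] - x[i]) for i in range(n - 1)]
    return mean, sd, sum(ramps) / len(ramps)

# Toy daylight profile plus noise, standing in for observed vs simulated data.
t = [i / 100 for i in range(100)]
observed  = [800 * math.sin(math.pi * u) + random.gauss(0, 20) for u in t]
simulated = [800 * math.sin(math.pi * u) + random.gauss(0, 20) for u in t]

print(series_stats(observed))
print(series_stats(simulated))
```

A simulation would be judged adequate, under these criteria, when its three statistics agree with the observed series within a tolerance chosen for the analysis at hand.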
EERE Web Site Engagement Statistics: FY09
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
WEB SITE ENGAGEMENT STATISTICS TECHNOLOGY ADVANCEMENT AND OUTREACH | 01 TABLE OF CONTENTS ... Views 02 Average Visit Duration 03 Top 20 Web Sites by Visits 03 Top 20 Visited Pages 04 ...
Statistical Methods for Environmental Pollution Monitoring
Office of Scientific and Technical Information (OSTI)
Statistical Methods for Environmental Pollution Monitoring. Richard O. Gilbert, Pacific Northwest Laboratory. Van Nostrand Reinhold Company, New York. Dedicated to my parents, Mary Margaret and Donald I. Gilbert. Copyright © 1987 by Van Nostrand Reinhold Company Inc. Library of Congress Catalog Card Number: 86-26758. ISBN 0-442-23050-8. Work supported by the U.S. Department of
Combined statistical and dynamical assessment of simulated vegetation-rainfall in North Africa during the mid-Holocene
Office of Scientific and Technical Information (OSTI)
A negative feedback of vegetation cover on subsequent annual precipitation is simulated for the mid-Holocene over
ORISE: Statistical Analyses of Worker Health
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Analyses Statistical analyses at the Oak Ridge Institute for Science and Education (ORISE) support ongoing programs involving medical surveillance of workers and other populations, as well as occupational epidemiology and research. ORISE emphasizes insightful and accurate analysis, practical interpretation of results and clear, easily read reports. All analyses are preceded by extensive data scrubbing and verification. ORISE's approach relies on applying appropriate methods of
Statistical Fault Detection & Diagnosis Expert System
Energy Science and Technology Software Center (OSTI)
1996-12-18
STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
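The train-then-monitor idea can be sketched in a few lines. This is a hypothetical stand-in for STATMON's pattern-recognition test (the threshold rule and parameters below are assumptions, not the system's actual logic): learn a signal's statistics during normal operation, then flag a sensor whose windowed mean leaves the learned band.

```python
import random

random.seed(5)

def train(signal):
    """Learn baseline mean and standard deviation during normal operation."""
    n = len(signal)
    mu = sum(signal) / n
    sd = (sum((v - mu) ** 2 for v in signal) / n) ** 0.5
    return mu, sd

def drifted(window, mu, sd, z=4.0):
    """Flag a change when the window mean deviates by more than z standard
    errors from the training mean (a simple illustrative test)."""
    m = sum(window) / len(window)
    return abs(m - mu) > z * sd / len(window) ** 0.5

baseline = [random.gauss(100.0, 2.0) for _ in range(1000)]
mu, sd = train(baseline)

healthy = [random.gauss(100.0, 2.0) for _ in range(50)]
failed  = [random.gauss(103.0, 2.0) for _ in range(50)]  # drifted sensor
print(drifted(healthy, mu, sd), drifted(failed, mu, sd))
```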
ARM - Lesson Plans: Historical Climate Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Historical Climate Statistics Outreach Home Room News Publications Traditional Knowledge Kiosks Barrow, Alaska Tropical Western Pacific Site Tours Contacts Students Study Hall About ARM Global Warming FAQ Just for Fun Meet our Friends Cool Sites Teachers Teachers' Toolbox Lesson Plans Lesson Plans: Historical Climate Statistics Objective The objective of this activity is to demonstrate the concept of climate change at a sample locality where the historical temperature records are available.
DOE - NNSA/NFO -- FOIA Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
U.S. DOE/NNSA - Nevada Field Office FOIA Statistics. The FOIA has become a useful tool for researchers, news media, and the general public. In October 1996, Congress enacted the Electronic Freedom of Information Act Amendments of 1996, one of which (5 U.S.C. 552(a)(6)(A)(i)) extended the agency response period from 10 days to 20 days, and another of which (5 U.S.C. 552(e)) requires agencies to make available to the public fiscal year FOIA Annual Reports. This
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1983-01-01
The statistics in this update of the Outer Continental Shelf Statistics publication document what has happened since federal leasing began on the Outer Continental Shelf (OCS) in 1954. Highlights note that of the 29.8 million acres actually leased from 175.6 million acres offered for leasing, 20.1% were in frontier areas. Total revenues for the 1954-1982 period were $58.9 billion with about 13% received in 1982. The book is divided into six parts covering highlights, leasing, exploration and development, production and revenue, reserves and undiscovered recoverable resources, and pollution problems from well and tanker accidents. 5 figures, 59 tables.
Picard, R.R.
1987-01-01
Many aspects of the MUF-D statistic, used for verification of accountability data, have been examined in the safeguards literature. In this paper, basic MUF-D results are extended to more general environments than are usually considered. These environments include arbitrary measurement error structures, various sampling regimes that could be imposed by the inspectorate, and the attributes/variables framework.
Baseballs and Barrels: World Statistics Day
Broader source: Energy.gov [DOE]
Statistics don't just help us answer trivia questions; they also help us make intelligent decisions. For example, if I heat my home with natural gas, I'm probably interested in what natural gas prices are likely to be this winter.
Statistics of dislocation pinning at localized obstacles
Dutta, A.; Bhattacharya, M.; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
Multifragmentation: New dynamics or old statistics?
Moretto, L.G.; Delis, D.N.; Wozniak, G.J.
1993-10-01
The understanding of the fission process as it has developed over the last fifty years has been applied to multifragmentation. Two salient aspects have been discovered: (1) a strong decoupling of the entrance and exit channels, with the formation of well-characterized sources; (2) a statistical competition among two-, three-, four-, five-, ..., n-body decays.
Resistive switching phenomena: A review of statistical physics...
Office of Scientific and Technical Information (OSTI)
Resistive switching phenomena: A review of statistical physics approaches. Authors: ...
Statistical Surrogate Models for Estimating Probability of High...
Office of Scientific and Technical Information (OSTI)
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change.
Statistical and Domain Analytics Applied to PV Module Lifetime...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical and Domain Analytics Applied to PV Module Lifetime and Degradation Science ...
Final Report on Statistical Debugging for Petascale Environments...
Office of Scientific and Technical Information (OSTI)
Final Report on Statistical Debugging for Petascale Environments ...
Statistical Behavior of Formation Process of Magnetic Vortex...
Office of Scientific and Technical Information (OSTI)
Technical Report: Statistical Behavior of Formation Process of Magnetic Vortex State in Ni80Fe20 Nanodisks
Environment/Health/Safety (EHS): Monthly Accident Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Personal Protective Equipment (PPE) Injury Review & Analysis Worker Safety and Health Program: PUB-3851 Monthly Accident Statistics Latest Accident Statistics Accident...
User Statistics Collection Practices Archives | U.S. DOE Office...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Policies and Processes User Statistics Collection Practices User Statistics Collection Practices Archives User Facilities User Facilities Home User Facilities at a Glance...
RITA-Bureau of Transportation Statistics | Open Energy Information
RITA-Bureau of Transportation Statistics. Agency/Company/Organization: United...
Topological Cacti: Visualizing Contour-based Statistics
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2011-05-26
Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology, often represented in form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as the width of the cactus trunk and the length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
Statistical approach to nuclear level density
Sen'kov, R. A.; Horoi, M.; Zelevinsky, V. G.
2014-10-15
We discuss the level density in a finite many-body system with strong interaction between the constituents. Our primary object of application is the atomic nucleus, but the same techniques can be applied to other mesoscopic systems. We calculate and compare nuclear level densities for given quantum numbers obtained by different methods, such as the nuclear shell model (the most successful microscopic approach), our main instrument, the moments method (a statistical approach), and the Fermi-gas model; the calculation with the moments method can use any shell-model Hamiltonian, excluding the spurious states of the center-of-mass motion. Our goal is to investigate statistical properties of nuclear level density, define its phenomenological parameters, and offer an affordable and reliable way of calculation.
Robust statistical reconstruction for charged particle tomography
2013-10-08
Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The methods determine the probability distribution of charged particle scattering using a statistical multiple-scattering model and compute a substantially maximum-likelihood estimate of the object volume scattering density using an expectation-maximization (ML/EM) algorithm. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic-ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented as a computer program executable on a computer.
Independent Statistics & Analysis Drilling Productivity Report
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Independent Statistics & Analysis Drilling Productivity Report The seven regions analyzed in this report accounted for 92% of domestic oil production growth and all domestic natural gas production growth during 2011-14. March 2016 For key tight oil and shale gas regions U.S. Energy Information Administration Contents Year-over-year summary 2 Bakken Region 3 Eagle Ford Region 4 Haynesville Region 5 Marcellus Region 6 Niobrara Region 7 Permian Region 8 Utica Region 9 Explanatory notes 10
Workforce Statistics | National Nuclear Security Administration
National Nuclear Security Administration (NNSA)
Statistical design of a uranium corrosion experiment
Wendelberger, Joanne R; Moore, Leslie M
2009-01-01
This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
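A full-factorial plan over the three factors named in the abstract can be generated mechanically; the two-level settings below are illustrative placeholders, not the study's actual design or levels:

```python
import itertools

# Hypothetical two-level settings for the three experiment factors.
factors = {
    "hydrogen_pressure": ["low", "high"],
    "gap_size": ["small", "large"],
    "abrasion": ["none", "abraded"],
}

# Every combination of levels: a 2 x 2 x 2 full-factorial design.
design = [dict(zip(factors, combo))
          for combo in itertools.product(*factors.values())]

for run in design:
    print(run)
print(len(design))  # 2 * 2 * 2 = 8 runs
```

A full factorial supports estimation of all main effects and interactions; fractional designs would trade some interactions for fewer runs.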
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1984-09-01
This publication is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Some of the highlights are: of the 329.5 million acres offered for leasing, 37.1 million acres were actually leased; total revenues for the 1954 to 1983 period were $68,173,112,563 and for 1983 $9,161,435,540; a total of 22,095 wells were drilled in federal waters and 10,145 wells were drilled in state waters; from 1954 through 1983, federal offshore areas produced 6.4 billion barrels of oil and condensate, and 62.1 trillion cubic feet of natural gas; in 1983 alone production was 340.7 million barrels of oil and condensate, and 3.9 trillion cubic feet of gas; and for the second straight year, no oil was lost in 1983 as a result of blowouts in federal waters. 8 figures, 66 tables.
Weatherization Assistance Program - Background Data and Statistics
Eisenberg, Joel Fred
2010-03-01
This technical memorandum is intended to provide readers with information that may be useful in understanding the purposes, performance, and outcomes of the Department of Energy's (DOE's) Weatherization Assistance Program (Weatherization). Weatherization has been in operation for over thirty years and is the nation's largest single residential energy efficiency program. Its primary purpose, established by law, is 'to increase the energy efficiency of dwellings owned or occupied by low-income persons, reduce their total residential energy expenditures, and improve their health and safety, especially low-income persons who are particularly vulnerable such as the elderly, the handicapped, and children.' The American Recovery and Reinvestment Act of 2009 (ARRA, P.L. 111-5), signed into law in February 2009, committed $5 billion over two years to an expanded Weatherization Assistance Program. This has created substantial interest in the program, the population it serves, the energy and cost savings it produces, and its cost-effectiveness. This memorandum is intended to address the need for this kind of information. Statistically valid answers to many of the questions surrounding Weatherization and its performance require comprehensive evaluation of the program. DOE is undertaking precisely this kind of independent evaluation in order to ascertain program effectiveness and to improve its performance. Results of this evaluation effort will begin to emerge in late 2010 and 2011, but they require substantial time and effort. In the meantime, the data and statistics in this memorandum can provide reasonable and transparent estimates of key program characteristics. The memorandum is laid out in three sections. The first deals with some key characteristics describing low-income energy consumption and expenditures. The second section provides estimates of energy savings and energy bill reductions that the program can reasonably be presumed to be producing.
The third section deals with estimates of program cost-effectiveness and societal impacts such as carbon reduction and reduced national energy consumption. Each of the sections is brief, containing statistics, explanatory graphics and tables as appropriate, and short explanations of the statistics in order to place them in context for the reader. The companion appendices at the back of the memorandum explain the methods and sources used in developing the statistics.
Statistical simulation of the magnetorotational dynamo
Squire, J.; Bhattacharjee, A.
2014-08-01
We analyze turbulence and dynamo action induced by the magnetorotational instability (MRI) using quasi-linear statistical simulation methods. We find that homogeneous turbulence is unstable to a large-scale dynamo instability, which saturates to an inhomogeneous equilibrium with a very strong dependence on the magnetic Prandtl number (Pm). Despite its enormously reduced nonlinearity, the quasi-linear model exhibits the same qualitative scaling of angular momentum transport with Pm as fully nonlinear turbulence. This demonstrates the relationship of recent convergence problems to the large-scale dynamo and suggests possible methods for studying astrophysically relevant regimes at very low or high Pm.
Statistical analysis of random duration times
Engelhardt, M.E.
1996-04-01
This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, the reliability function, and the cumulative hazard function, results that apply to either complete or censored data. Several models commonly used with time data are discussed, along with methods for model checking and goodness-of-fit testing. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations involving repeated durations, such as repairable systems, are also presented.
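As an illustration of the kind of nonparametric reliability estimate such a report covers, the following sketch computes the product-limit (Kaplan-Meier) estimate of the reliability function from right-censored durations. The data are invented for the example and are not taken from the report:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the reliability/survival
    function from possibly right-censored duration data.

    times  -- observed durations
    events -- 1 if the duration ended in a failure, 0 if censored
    """
    # Sort observations by time, keeping each event indicator paired.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []          # (time, S(t)) at each observed failure time
    for t, d in data:
        if d == 1:      # failure: survival drops by the conditional factor
            survival *= (n_at_risk - 1) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= 1  # failed and censored units both leave the risk set
    return curve

# Illustrative data: censored observations at t = 4 and t = 9.
times = [2, 4, 5, 7, 9, 10]
events = [1, 0, 1, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Censored observations contribute by shrinking the risk set without forcing a drop in the curve, which is exactly why the estimator remains valid for incomplete data.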
EERE Web Site Statistics - Social Media
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
EERE Web Site Statistics - Social Media Custom View: 10/1/10 - 9/30/11 October 1, 2010 12:00:00 AM - September 30, 2011 11:59:59 PM Table of Contents Overview Dashboard 3 By Number of Visits 4 Domain Names 23 Top-Level Domain Types 26 Countries 29 Visits Trend 32 Visits by Number of Pages Viewed 34 Visit Duration by Visits 36 Visit Duration by Page Views 39 Pages 42 Page Views Trend 45 File Downloads 47 Entry Pages 48 Exit Pages 50 Single-Page Visits 53 Paths, Forward 56 Referring Site 84
EERE Web Site Statistics - Multimedia
Broader source: Energy.gov (indexed) [DOE]
EERE Web Site Statistics - Multimedia Custom View: 10/1/10 - 9/30/11 October 1, 2010 12:00:00 AM - September 30, 2011 11:59:59 PM Table of Contents Overview Dashboard 3 By Number of Visits 4 Domain Names 17 Top-Level Domain Types 20 Countries 23 Visits Trend 26 Visits by Number of Pages Viewed 28 Visit Duration by Visits 31 Visit Duration by Page Views 34 Pages 37 Page Views Trend 43 File Downloads 45 Entry Pages 46 Exit Pages 52 Single-Page Visits 59 Paths, Forward 66 Referring Site 90
Lectures on probability and statistics. Revision
Yost, G.P.
1985-06-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. The notes finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice) and attempts to make inferences about the state of nature that gave rise to those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. The reader will hopefully come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried ''best'' solution - ''best'' according to every criterion.
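The forward (probability) and inverse (statistics) problems that the lectures contrast can be shown in a few lines. The dice data below are invented for the example:

```python
from fractions import Fraction
from itertools import product

# Forward problem (probability): for two fair dice every ordered outcome
# is equally likely a priori, so P(sum == 7) follows by direct enumeration.
outcomes = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_seven)  # 6 favourable outcomes out of 36

# Inverse problem (statistics): given observed rolls of a single die,
# estimate the probability of each face. With finite data the estimates
# scatter around 1/6 even when the die is perfectly fair.
rolls = [3, 6, 1, 3, 5, 2, 6, 4, 3, 1, 2, 5]  # illustrative data
estimates = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
```

The scatter of the empirical frequencies around 1/6 is the simplest instance of the uncertainty the lectures emphasize: the inverse problem has no unique ''best'' answer, only estimates with sampling error.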
International energy indicators. [Statistical tables and graphs
Bauer, E.K.
1980-05-01
International statistical tables and graphs are given for the following: (1) Iran - Crude Oil Capacity, Production and Shut-in, June 1974-April 1980; (2) Saudi Arabia - Crude Oil Capacity, Production, and Shut-in, March 1974-Apr 1980; (3) OPEC (Ex-Iran and Saudi Arabia) - Capacity, Production and Shut-in, June 1974-March 1980; (4) Non-OPEC Free World and US Production of Crude Oil, January 1973-February 1980; (5) Oil Stocks - Free World, US, Japan, and Europe (Landed, 1973-1st Quarter, 1980); (6) Petroleum Consumption by Industrial Countries, January 1973-December 1979; (7) USSR Crude Oil Production and Exports, January 1974-April 1980; and (8) Free World and US Nuclear Generation Capacity, January 1973-March 1980. Similar statistical tables and graphs included for the United States include: (1) Imports of Crude Oil and Products, January 1973-April 1980; (2) Landed Cost of Saudi Oil in Current and 1974 Dollars, April 1974-January 1980; (3) US Trade in Coal, January 1973-March 1980; (4) Summary of US Merchandise Trade, 1976-March 1980; and (5) US Energy/GNP Ratio, 1947 to 1979.
International petroleum statistics report, March 1998
1998-03-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report, July 1999
1999-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 44 tabs.
Statistics and geometry of cosmic voids
Gaite, José
2009-11-01
We introduce new statistical methods for the study of cosmic voids, focusing on the statistics of largest size voids. We distinguish three different types of distributions of voids, namely, Poisson-like, lognormal-like and Pareto-like distributions. The last two distributions are connected with two types of fractal geometry of the matter distribution. Scaling voids with Pareto distribution appear in fractal distributions with box-counting dimension smaller than three (its maximum value), whereas the lognormal void distribution corresponds to multifractals with box-counting dimension equal to three. Moreover, voids of the former type persist in the continuum limit, namely, as the number density of observable objects grows, giving rise to lacunar fractals, whereas voids of the latter type disappear in the continuum limit, giving rise to non-lacunar (multi)fractals. We propose both lacunar and non-lacunar multifractal models of the cosmic web structure of the Universe. A non-lacunar multifractal model is supported by current galaxy surveys as well as cosmological N-body simulations. This model suggests, in particular, that small dark matter halos and, arguably, faint galaxies are present in cosmic voids.
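The distinction drawn above between Pareto-like and lognormal-like void-size distributions can be illustrated through the slope of the survival function on log-log axes: a Pareto law has a roughly constant slope (-alpha) at all scales, while a lognormal tail steepens with size. This is a generic sketch with synthetic samples, not the estimator used in the paper:

```python
import math
import random

random.seed(1)

def pareto_sample(alpha, n, xmin=1.0):
    """Pareto(alpha) draws via the inverse CDF of S(x) = (xmin/x)**alpha."""
    return [xmin * (1 - random.random()) ** (-1.0 / alpha) for _ in range(n)]

def lognormal_sample(mu, sigma, n):
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

def log_survival_slope(sample, q1, q2):
    """Slope of log S(x) versus log x between two sample quantiles."""
    xs = sorted(sample)
    n = len(xs)
    x1, x2 = xs[int(q1 * n)], xs[int(q2 * n)]
    return (math.log(1 - q2) - math.log(1 - q1)) / (math.log(x2) - math.log(x1))

par = pareto_sample(1.5, 50000)     # scaling voids: power-law tail
logn = lognormal_sample(0.0, 1.0, 50000)  # multifractal-type voids
# Pareto: slope is near -alpha in both quantile windows.
# Lognormal: the slope steepens (grows in magnitude) in the deeper tail.
slopes = {
    "pareto": (log_survival_slope(par, 0.5, 0.9),
               log_survival_slope(par, 0.9, 0.99)),
    "lognormal": (log_survival_slope(logn, 0.5, 0.9),
                  log_survival_slope(logn, 0.9, 0.99)),
}
```

The scale-independent slope is the signature of the lacunar (Pareto) case that persists in the continuum limit; the steepening tail marks the non-lacunar (lognormal) case.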
International Petroleum Statistics Report, January 1994
Not Available
1994-01-31
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, November 1993
Not Available
1993-11-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
Statistical correlations in the Moshinsky atom
Laguna, H. G.; Sagar, R. P.
2011-07-15
We study the influence of the interparticle and confining potentials on statistical correlation via the correlation coefficient and mutual information in ground and some excited states of the Moshinsky atom in position and momentum space. The magnitude of the correlation between positions and between momenta is equal in the ground state. In excited states, the correlation between the momenta of the particles is greater than between their positions when they interact through an attractive potential whereas for repulsive interparticle potentials the opposite is true. Shannon entropies, and their sums (entropic formulations of the uncertainty principle), are also analyzed, showing that the one-particle entropy sum is dependent on the interparticle potential and thus able to detect the correlation between particles.
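For bivariate Gaussian statistics, which the harmonic Moshinsky ground state exhibits in both position and momentum space, the mutual information is fixed by the correlation coefficient alone: I = -(1/2) ln(1 - rho^2). The sketch below checks this numerically with synthetic samples; the sampler and the parameter value are illustrative assumptions, not quantities from the paper:

```python
import math
import random

random.seed(0)

def correlated_pair(rho, n):
    """Draw n samples from a standard bivariate normal with correlation rho."""
    xs, ys = [], []
    for _ in range(n):
        u = random.gauss(0, 1)
        v = random.gauss(0, 1)
        xs.append(u)
        ys.append(rho * u + math.sqrt(1 - rho ** 2) * v)
    return xs, ys

def corr(xs, ys):
    """Sample (Pearson) correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sxy / (sx * sy)

rho = 0.8                      # illustrative correlation strength
xs, ys = correlated_pair(rho, 20000)
r = corr(xs, ys)
# Gaussian mutual information is determined by the correlation alone.
mi = -0.5 * math.log(1 - r ** 2)
```

This is why, in the Gaussian case, the correlation coefficient and the mutual information rank states identically; for non-Gaussian excited states the two measures can disagree, which is part of what the paper examines.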
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes the software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
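The timing-baseline idea in this abstract can be sketched as a comparison of execution-time samples from the two machines. The threshold rule below, a crude z-test on the mean, is an illustrative stand-in for whatever statistical comparison the actual system uses:

```python
import statistics
import time

def timed_runs(fn, n=30):
    """Collect n execution-time samples of an instrumented function."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - t0)
    return samples

def looks_anomalous(baseline, observed, k=3.0):
    """Flag the observed timings when their mean lies more than k baseline
    standard deviations from the baseline mean. The baseline comes from the
    known-pedigree machine, the observed timings from the suspect machine."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline) or 1e-12
    return abs(statistics.mean(observed) - mu) > k * sigma

# Illustrative synthetic timing samples (seconds), not real measurements:
base = [1.00, 1.02, 0.98, 1.01, 0.99]
print(looks_anomalous(base, [1.00, 1.01, 0.99, 1.00, 1.02]))  # consistent
print(looks_anomalous(base, [1.50, 1.48, 1.52]))              # suspicious
```

A real deployment would need to control for hardware differences, caching, and scheduler noise before attributing a timing shift to malware; the point of the sketch is only the baseline-versus-observed comparison.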
International petroleum statistics report, October 1997
1997-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International Petroleum Statistics Report, July 1994
Not Available
1994-07-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993. Data for the United States are developed by the Energy Information Administration`s (EIA) Office of Oil and Gas. Data for other countries are derived largely from published sources, including International Energy Agency publications, the EIA International Energy Annual, and the trade press. (See sources after each section.) All data are reviewed by the International Statistics Branch of EIA. All data have been converted to units of measurement familiar to the American public. Definitions of oil production and consumption are consistent with other EIA publications.
FY 2014 Budget Request Statistical Table | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
FY 2014 Budget Request Statistical Table (PDF: Stats Table FY2014.pdf).
EU Pocketbook - European Vehicle Market Statistics | Open Energy...
European Vehicle Market Statistics. Agency/Company Organization: International Council on Clean Transportation. Website: eupocketbook.theicct.org. Transport Toolkit...
2011 Annual Merit Review Results Report - Project and Program Statistics
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
2011 Annual Merit Review Results Report - Project and Program Statistics Calculations Overview: merit review of DOE Vehicle Technologies research activities (PDF: 2011_amr_11.pdf).
2012 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
2012 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview: merit review of DOE Vehicle Technologies research activities (PDF: 2012_amr_11.pdf).
2014 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
2014 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview: merit review of DOE Vehicle Technologies research activities (PDF: 2014_amr_12.pdf).
An overview of component qualification using Bayesian statistics...
Office of Scientific and Technical Information (OSTI)
Country of Publication: United States. Language: English. Subject: 99 GENERAL AND MISCELLANEOUS/MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; LEARNING; STATISTICS; MATHEMATICS ...
Overview of North American Energy Trade Statistics: Methodologies...
U.S. Energy Information Administration (EIA) Indexed Site
Cooperation on Energy Information, Subgroup A: Energy Trade Statistics, December 2015 DRAFT (December 16, 2015): Electricity exports and imports ...
2013 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
2013 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview: merit review of DOE Vehicle Technologies research activities (PDF: 2013_amr_12.pdf).
Random paths and current fluctuations in nonequilibrium statistical mechanics
Gaspard, Pierre
2014-07-15
An overview is given of recent advances in nonequilibrium statistical mechanics concerning the statistics of random paths and current fluctuations. Whereas equilibrium statistical mechanics carries out statistics over space, nonequilibrium systems call for statistics over time or spacetime. In this approach, relationships have been established between nonequilibrium properties, such as the transport coefficients, the thermodynamic entropy production, or the affinities, and quantities characterizing the microscopic Hamiltonian dynamics and the chaos or fluctuations it may generate. This overview presents results for classical systems in the escape-rate formalism, stochastic processes, and open quantum systems.
International petroleum statistics report, April 1998
1998-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, May 1998
1998-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. It presents data on international production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, August 1994
Not Available
1994-08-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, May 1995
1995-05-30
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1983 through 1993.
International petroleum statistics report, March 1995
1995-03-30
The International Petroleum Statistics Report presents data for March 1995 on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, October 1993
Not Available
1993-10-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1982 through 1992.
International petroleum statistics report, December 1993
Not Available
1993-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992. 41 tabs.
International petroleum statistics report, April 1994
Not Available
1994-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1982 through 1992. 41 tables.
International petroleum statistics report, September 1994
Not Available
1994-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, February 1994
Not Available
1994-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, November 1994
Not Available
1994-11-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
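The probability-level results described above can be illustrated with a toy Monte Carlo calculation. The distributions and nominal values below are invented placeholders, not NBSR data; the sketch only shows how a 95% limit is read off an empirical cumulative distribution of sampled CHFR values.

```python
import random
import statistics

# Toy Monte Carlo sketch of a statistical hot-channel analysis.
# The uncertainty magnitudes and nominal CHFR below are illustrative
# placeholders; the actual analysis propagates measured NBSR uncertainties.
random.seed(0)

def sample_chfr(nominal_chfr=2.0, n=100_000):
    """Propagate uncertain hot channel factors into a CHFR sample."""
    samples = []
    for _ in range(n):
        power_factor = random.gauss(1.0, 0.05)  # local power uncertainty
        flow_factor = random.gauss(1.0, 0.03)   # channel flow uncertainty
        samples.append(nominal_chfr * flow_factor / power_factor)
    return samples

chfr = sample_chfr()
chfr.sort()
# Empirical CDF: the 5th percentile is the CHFR value exceeded with
# 95% probability, analogous to the report's probability-level limits.
limit_95 = chfr[int(0.05 * len(chfr))]
print(f"median CHFR {statistics.median(chfr):.2f}; "
      f"95% of samples lie above {limit_95:.2f}")
```

The same sorted sample yields the 90% and 99.9% levels by reading off the 10th and 0.1st percentiles.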
International petroleum statistics report, April 1999
1999-05-04
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, August 1998
1998-08-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, February 1996
1996-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, July 1998
1998-07-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, December 1998
1998-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, May 1999
1999-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 48 tabs.
International petroleum statistics report, October 1998
1998-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, December 1997
1997-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. The balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, September 1998
1998-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, June 1998
1998-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, April 1997
1997-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, June 1997
1997-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 46 tabs.
International petroleum statistics report, March 1999
1999-03-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, September 1996
1996-09-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report, February 1998
1998-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International petroleum statistics report, June 1999
1999-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 46 tabs.
International petroleum statistics report, February 1997
1997-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, March 1994
1994-03-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, January 1999
1999-01-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, February 1999
1999-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997.
International petroleum statistics report, November 1998
1998-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, September 1995
1995-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994. 4 figs., 45 tabs.
A Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
Duran, Adam; Walkowicz, Kevin (National Renewable Energy Laboratory)
2013-09-24
Published 09/24/2013, doi:10.4271/2013-01-2400, saecomveh.saejournals.org. In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating
Statistics of particle time-temperature histories.
Hewson, John C.; Lignell, David O.; Sun, Guangyuan
2014-10-01
Particles in non-isothermal turbulent flow are subject to a stochastic environment that produces a distribution of particle time-temperature histories. This distribution is a function of the dispersion of the non-isothermal (continuous) gas phase and the distribution of particles relative to that gas phase. In this work we extend the one-dimensional turbulence (ODT) model to predict the joint dispersion of a dispersed particle phase and a continuous phase. The ODT model predicts the turbulent evolution of continuous scalar fields with a model for the cascade of fluctuations to smaller scales (the 'triplet map') at a rate that is a function of the fully resolved one-dimensional velocity field. Stochastic triplet maps also drive Lagrangian particle dispersion with finite Stokes numbers, including inertial and eddy trajectory-crossing effects. Two distinct approaches to this coupling between triplet maps and particle dispersion are developed and implemented, along with a hybrid approach. An 'instantaneous' particle displacement model matches the tracer particle limit and provides an accurate description of particle dispersion. A 'continuous' particle displacement model translates triplet maps into a continuous velocity field to which particles respond. Particles can alter the turbulence, and modifications to the stochastic rate expression are developed for two-way coupling between particles and the continuous phase. Each aspect of model development is evaluated in canonical flows (homogeneous turbulence, free-shear flows and wall-bounded flows) for which quality measurements are available. ODT simulations of non-isothermal flows provide statistics for particle heating. These simulations show the significance of accurately predicting the joint statistics of particle and fluid dispersion. Inhomogeneous turbulence coupled with the influence of the mean flow fields on particles of varying properties alters particle dispersion.
The joint particle-temperature dispersion leads to a distribution of temperature histories predicted by the ODT. Predictions are shown for the lower moments and the full distributions of the particle positions, particle-observed gas temperatures and particle temperatures. An analysis of the time scales affecting particle-temperature interactions covers Lagrangian integral time scales based on temperature autocorrelations, rates of temperature change associated with particle motion relative to the temperature field and rates of diffusional change of temperatures. These latter two time scales have not been investigated previously; they are shown to be strongly intermittent, having peaked distributions with long tails. The logarithm of the absolute value of these time scales exhibits a distribution closer to normal. Acknowledgements: This work is supported by the Defense Threat Reduction Agency (DTRA) under their Counter-Weapons of Mass Destruction Basic Research Program in the area of Chemical and Biological Agent Defeat under award number HDTRA1-11-4503I to Sandia National Laboratories. The authors would like to express their appreciation for the guidance provided by Dr. Suhithi Peiris to this project and to the Science to Defeat Weapons of Mass Destruction program.
Statistical theory of turbulent incompressible multimaterial flow
Kashiwa, B.
1987-10-01
Interpenetrating motion of incompressible materials is considered. ''Turbulence'' is defined as any deviation from the mean motion. Accordingly a nominally stationary fluid will exhibit turbulent fluctuations due to a single, slowly moving sphere. Mean conservation equations for interpenetrating materials in arbitrary proportions are derived using an ensemble averaging procedure, beginning with the exact equations of motion. The result is a set of conservation equations for the mean mass, momentum and fluctuational kinetic energy of each material. The equation system is at first unclosed due to integral terms involving unknown one-point and two-point probability distribution functions. In the mean momentum equation, the unclosed terms are clearly identified as representing two physical processes. One is transport of momentum by multimaterial Reynolds stresses, and the other is momentum exchange due to pressure fluctuations and viscous stress at material interfaces. Closure is approached by combining careful examination of multipoint statistical correlations with the traditional physical technique of kappa-epsilon modeling for single-material turbulence. This involves representing the multimaterial Reynolds stress for each material as a turbulent viscosity times the rate of strain based on the mean velocity of that material. The multimaterial turbulent viscosity is related to the fluctuational kinetic energy kappa, and the rate of fluctuational energy dissipation epsilon, for each material. Hence a set of kappa and epsilon equations must be solved, together with mean mass and momentum conservation equations, for each material. Both kappa and the turbulent viscosities enter into the momentum exchange force. The theory is applied to (a) calculation of the drag force on a sphere fixed in a uniform flow, (b) calculation of the settling rate in a suspension and (c) calculation of velocity profiles in the pneumatic transport of solid particles in a pipe.
Design and performance of a scalable, parallel statistics toolkit.
Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre
2010-11-01
Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last 3 years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics clearer. Furthermore, it employs a design pattern specifically targeted for distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question is continuous or quasi-diffuse but do scale well when the data is discrete and compact. We confirm and extend our earlier results by now establishing near-optimal scalability with up to 10,000 processes.
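The distributed-memory design pattern described above rests on merging partial moment summaries computed independently on each process. A minimal single-machine sketch of that pairwise-update idea (function names are illustrative, not the toolkit's actual API):

```python
# Merge partial (count, mean, M2) summaries -- the pairwise-update
# pattern behind distributed moment-based statistics. Illustrative
# sketch only; the actual toolkit runs this as an MPI reduction.

def partial_moments(xs):
    """Compute (n, mean, M2) for one process's local data (Welford)."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return n, mean, m2

def merge(a, b):
    """Combine two partial summaries, as a reduction tree would."""
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
left = partial_moments(data[:3])    # "process 0"
right = partial_moments(data[3:])   # "process 1"
n, mean, m2 = merge(left, right)
variance = m2 / (n - 1)             # sample variance of the full data
```

Because `merge` is associative, the same update can be applied over any reduction tree, which is what makes the scaling nearly linear in the number of processes.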
A Statistical Framework for Microbial Source Attribution
Velsko, S P; Allen, J E; Cunningham, C T
2009-04-28
This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them possess certain compelling general properties in the context of microbial forensics. 
These include the ability to quantify the significance of a sequence 'match' or 'mismatch' between two isolates; the ability to capture non-intuitive effects of network structure on inferential power, including the 'small world' effect; the insensitivity of inferences to uncertainties in the underlying distributions; and the concept of rescaling, i.e. ability to collapse sub-networks into single nodes and examine transmission inferences on the rescaled network.
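The second of the two fundamental distributions described above, the probability that two randomly sampled nodes are separated by M transmission events, can be illustrated on a toy transmission tree. The sketch below uses a hypothetical five-edge tree, not the report's SARS, FMDV, or HIV data:

```python
# Distribution of the number of transmission steps M separating two
# randomly sampled nodes in a toy outbreak tree. The tree below is
# hypothetical; node 0 is the index case.

from collections import Counter, deque

parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2}  # child -> infector

def neighbors(node):
    """Adjacent nodes in the (undirected) transmission tree."""
    nbrs = [c for c, p in parent.items() if p == node]
    if node in parent:
        nbrs.append(parent[node])
    return nbrs

def path_length(a, b):
    """BFS distance (number of transmission events) between two nodes."""
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, d = queue.popleft()
        if node == b:
            return d
        for nb in neighbors(node):
            if nb not in seen:
                seen.add(nb)
                queue.append((nb, d + 1))

nodes = sorted({0, *parent, *parent.values()})
dist_of_M = Counter(
    path_length(a, b) for i, a in enumerate(nodes) for b in nodes[i + 1:]
)  # histogram of M over all unordered node pairs
```

Normalizing `dist_of_M` by the number of pairs gives the empirical P(M) for this tree; the report estimates the analogous distribution from contact-tracing data rather than from a known tree.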
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
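For interval data of the kind the report treats, some descriptive statistics have simple closed-form bounds. A minimal sketch under the usual definitions (the variance and other statistics the report discusses are harder to bound and are omitted here; the data values are illustrative):

```python
# Enclosing bounds on descriptive statistics for interval-valued data:
# each measurement is a [lo, hi] interval rather than a point estimate.

def interval_mean(intervals):
    """Tightest enclosing interval for the sample mean."""
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

def interval_median(intervals):
    """Enclosing interval for the sample median (odd sample size).

    The median is monotone in each data value, so its extremes occur
    when every value sits at its lower (resp. upper) endpoint.
    """
    los = sorted(lo for lo, _ in intervals)
    his = sorted(hi for _, hi in intervals)
    mid = len(intervals) // 2
    return los[mid], his[mid]

data = [(1.0, 2.0), (1.5, 2.5), (3.0, 4.0)]
m_lo, m_hi = interval_mean(data)
med_lo, med_hi = interval_median(data)
```

The mean bounds are exact and cheap; as the report notes, statistics such as the variance can be NP-hard to bound exactly depending on how the intervals overlap.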
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS
Office of Scientific and Technical Information (OSTI)
(Technical Report) | SciTech Connect. The research under this project focused on theoretical and computational modeling of dislocation dynamics of mesoscale deformation of metal single crystals. Specifically, the work aimed to implement a continuum statistical theory of dislocations to understand
Nonlinearity sensing via photon-statistics excitation spectroscopy
Assmann, Marc; Bayer, Manfred
2011-11-15
We propose photon-statistics excitation spectroscopy as an adequate tool to describe the optical response of a nonlinear system. To this end, we suggest using optical excitation with varying photon statistics as another spectroscopic degree of freedom to gather information about the system in question. The responses of several simple model systems to excitation beams with different photon statistics are discussed. Possible spectroscopic applications in terms of identifying lasing operation are pointed out.
Angular-momentum nonclassicality by breaking classical bounds on statistics
Luis, Alfredo; Rivas, Angel
2011-10-15
We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimum and definite conclusions are obtained without evaluation of moments, or any other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.
Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico
Office of Environmental Management (EM)
UNECE-Annual Bulletin of Transport Statistics for Europe and...
Data covers Europe, Canada and the United States. This is a trilingual publication in English, French and Russian." "This annual publication presents statistics and brief studies...
Autocorrelation Function Statistics and Implication to Decay Ratio Estimation
March-Leuba, Jose A.
2016-01-01
This document summarizes the results of a series of computer simulations to attempt to identify the statistics of the autocorrelation function, and implications for decay ratio estimation.
Fact #602: December 21, 2009 Freight Statistics by Mode, 2007...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Note: NA not available due to high sampling variability or poor response quality. Air ... Statistics, "U.S. Freight on the Move: Survey Preliminary Data," SR-018, September
WHO Statistical Information System (WHOSIS) | Open Energy Information
Classification of Diseases (ICD-10), International Classification of Impairments, Disabilities and Handicaps (ICIDH) Links to other sources of health-related statistical...
Experimental and Statistical Comparison of Engine Response as...
Office of Scientific and Technical Information (OSTI)
Experimental and Statistical Comparison of Engine Response as a Function of Fuel Chemistry ... Engine Response as a Function of Fuel Chemistry and Properties in CI and HCCI Engines ...
International Monetary Fund-Data and Statistics | Open Energy...
"The IMF publishes a range of time series data on IMF lending, exchange rates and other economic and financial indicators. Manuals, guides, and other material on statistical...
IRF-World Road Statistics | Open Energy Information
AgencyCompany Organization: International Road Statistics Focus Area: Transportation, Economic Development Resource Type: Dataset Website: www.irfnet.orgstatistics.php Cost:...
Random-matrix approach to the statistical compound nuclear reaction...
Office of Scientific and Technical Information (OSTI)
nuclear reaction at low energies using the Monte-Carlo technique Citation Details In-Document Search Title: Random-matrix approach to the statistical compound nuclear ...
BP Statistical Review of World Energy | Open Energy Information
OpenEI The BP Statistical Review of World Energy is an Excel spreadsheet which contains consumption and production data for Coal, Natural Gas, Nuclear, Oil, and Hydroelectric...
Statistical surrogate models for prediction of high-consequence...
Office of Scientific and Technical Information (OSTI)
We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is ...
Testing Statistical Cloud Scheme Ideas in the GFDL Climate Model
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Testing Statistical Cloud Scheme Ideas in the GFDL Climate Model Klein, Stephen Lawrence Livermore National Laboratory Pincus, Robert NOAA-CIRES Climate Diagnostics Center...
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS...
Office of Scientific and Technical Information (OSTI)
Anter El-Azab. 36 MATERIALS SCIENCE. Keywords: dislocation dynamics; mesoscale deformation of metals; crystal mechanics...
Key World Energy Statistics-2010 | Open Energy Information
World Energy Statistics-2010 AgencyCompany Organization: International Energy Agency Sector: Energy Topics: Market analysis Resource Type: Dataset, Maps Website: www.iea.org...
Evaluation of cirrus statistics produced by general circulation...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
cirrus statistics produced by general circulation models using ARM data Hartsock, Daniel University of Utah Mace, Gerald University of Utah Benson, Sally University of Utah...
UN-Glossary for Transportation Statistics | Open Energy Information
Publications Website: www.internationaltransportforum.orgPubpdfGloStat3e.pdf Cost: Free UN-Glossary for Transportation Statistics Screenshot References: UN-Glossary for...
Infinite statistics condensate as a model of dark matter
Ebadi, Zahra; Mirza, Behrouz; Mohammadzadeh, Hosein E-mail: b.mirza@cc.iut.ac.ir
2013-11-01
In some models, dark matter is considered as a condensate bosonic system. In this paper, we prove that condensation is also possible for particles that obey infinite statistics and derive the critical condensation temperature. We argue that a condensed state of a gas of very weakly interacting particles obeying infinite statistics could be considered as a consistent model of dark matter.
Sub-Poissonian statistics in order-to-chaos transition
Kryuchkyan, Gagik Yu. [Yerevan State University, Manookyan 1, Yerevan 375049, (Armenia); Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia); Manvelyan, Suren B. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia)
2003-07-01
We study the phenomena at the overlap of quantum chaos and nonclassical statistics for a time-dependent model of a nonlinear oscillator. It is shown, in the framework of the Mandel Q parameter and the Wigner function, that the statistics of oscillatory excitation numbers is drastically changed in the order-to-chaos transition. An essential improvement of sub-Poissonian statistics, in comparison with the analogous one for the standard model of a driven anharmonic oscillator, is observed in the regular operational regime. It is shown that in the chaotic regime the system exhibits ranges of sub-Poissonian and super-Poissonian statistics which alternate with each other depending on the time interval. An unusual dependence of the variance of the oscillatory number on the external noise level is observed for the chaotic dynamics. The scaling invariance of the quantum statistics is demonstrated and its relation to dissipation and decoherence is studied.
Statistical mechanics based on fractional classical and quantum mechanics
Korichi, Z.; Meftah, M. T.
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamical properties of the classical ideal gas and a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase space coordinates (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation and study Bose-Einstein statistics, with the related problem of condensation, and Fermi-Dirac statistics.
Techniques in teaching statistics : linking research production and research use.
Martinez-Moyano, I .; Smith, A.
2012-01-01
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Statistical anisotropies in gravitational waves in solid inflation
Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi E-mail: emami@ipm.ir E-mail: yw366@cam.ac.uk
2014-09-01
Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can be either positive or negative, which impacts the statistical anisotropies of the TT and TB spectra in the CMB map more significantly than the tensor self-correlation does. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, which is different from the power spectrum of the scalar, the cross-correlation, or the scalar bispectrum, where the quadrupole-type statistical anisotropy dominates over the octopole.
A Divergence Statistics Extension to VTK for Performance Analysis.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-02-01
This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
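As a rough illustration of what a divergence statistic measures, the discrepancy between an observed histogram and an "ideal" reference distribution can be quantified with the Kullback-Leibler divergence. This is a stand-in chosen for familiarity, not the VTK engine's actual measure or API:

```python
# Discrepancy between an observed histogram and a theoretical "ideal"
# distribution, in the spirit of a divergence statistic. Uses the
# Kullback-Leibler divergence as an illustrative choice of measure.

from math import log

def kl_divergence(p, q):
    """KL divergence D(p || q) for two discrete distributions.

    Zero iff p == q; grows as p departs from q. Bins where p is zero
    contribute nothing; q is assumed nonzero on p's support.
    """
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

observed = [0.5, 0.3, 0.2]        # empirical bin frequencies
ideal = [1 / 3, 1 / 3, 1 / 3]     # theoretical uniform reference

d = kl_divergence(observed, ideal)
```

Unlike a true metric, KL divergence is asymmetric, which is one reason a production engine may prefer other divergence measures for distance-like comparisons.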
The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical...
Office of Scientific and Technical Information (OSTI)
The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release
UN-Energy Statistics Database | Open Energy Information
PV, Wind Resource Type: Dataset Website: data.un.orgExplorer.aspx?dEDATA Cost: Free Language: English UN-Energy Statistics Database Screenshot References: UN Data1 "The United...
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
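The chi-square minimization fits mentioned for lattice correlation functions can be sketched in simplified form. The example below fits C(t) = A exp(-m t) by unweighted least squares on log C(t), a simplified stand-in for the weighted chi-square fits the report's R packages perform; the correlator data are synthetic:

```python
# Fit a two-point-correlator-like decay C(t) = A * exp(-m * t) by
# linear least squares on log C(t). A simplified stand-in for a
# weighted chi-square fit; data below are synthetic and noiseless.

from math import exp, log

def fit_exponential(ts, cs):
    """Return (A, m) from a log-linear least-squares fit."""
    ys = [log(c) for c in cs]
    n = len(ts)
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    intercept = ybar - slope * tbar
    return exp(intercept), -slope   # A = e^intercept, m = -slope

ts = list(range(8))
cs = [2.0 * exp(-0.5 * t) for t in ts]   # synthetic correlator
A, m = fit_exponential(ts, cs)           # recovers A = 2, m = 0.5
```

A real lattice fit would weight each time slice by its (correlated) statistical error and minimize the full chi-square, which is exactly the machinery the report packages in R.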
Table B-1: Analytical Results: Statistical Mean and Upper Confidence Limit
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
TCLP Metals: Arsenic, Barium, Cadmium, Chromium, Lead, Mercury, Selenium, Silver
TCLP Volatiles: Benzene, Carbon Tetrachloride, Chlorobenzene, Chloroform
TCLP Semivolatiles: o-Cresol, p-Cresol, m-Cresol, Cresol, 2,4-Dinitrotoluene, Hexachlorobenzene, Hexachlorobutadiene, Nitrobenzene, Pentachlorophenol, 2,4,5-Trichlorophenol, 2,4,6-Trichlorophenol, Hexachloroethane
TCLP Pesticides and Herbicides:
TITLE V-CONFIDENTIAL INFORMATION PROTECTION AND STATISTICAL EFFICIENCY
Gasoline and Diesel Fuel Update (EIA)
6 STAT. 2962 PUBLIC LAW 107-347, DEC. 17, 2002. TITLE V-CONFIDENTIAL INFORMATION PROTECTION AND STATISTICAL EFFICIENCY. SEC. 501. SHORT TITLE. This title may be cited as the ''Confidential Information Protection and Statistical Efficiency Act of 2002''. SEC. 502. DEFINITIONS. As used in this title: (1) The term ''agency'' means any entity that falls within the definition of the term ''executive agency'' as defined in section 102 of title 31, United States Code, or ''agency'', as defined in
Physics-based statistical learning approach to mesoscopic model selection
Office of Scientific and Technical Information (OSTI)
(Journal Article) | SciTech Connect. This content will become publicly available on November 8, 2016. Authors: Taverniers, Søren; Haut, Terry S.; Barros, Kipton; Alexander, Francis J.; Lookman, Turab. Publication Date: 2015-11-09. OSTI Identifier: 1225546. Grant/Contract Number: AC52-06NA25396;
Lightweight and Statistical Techniques for Petascale Debugging: Correctness
Office of Scientific and Technical Information (OSTI)
on Petascale Systems (CoPS) Preliminary Report (Technical Report) | SciTech Connect. Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific
Mathematical and Statistical Opportunities in Cyber Security (Technical
Office of Scientific and Technical Information (OSTI)
Report) | SciTech Connect. The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question 'What fundamental problems exist
An overview of component qualification using Bayesian statistics and energy
Office of Scientific and Technical Information (OSTI)
methods. (Technical Report) | SciTech Connect. The below overview is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation; a basic understanding of some of the mathematical tools to evaluate the quality of an estimation; an
Final Report on Statistical Debugging for Petascale Environments (Technical
Office of Scientific and Technical Information (OSTI)
Report) | SciTech Connect: Final Report on Statistical Debugging for Petascale Environments.
Resistive switching phenomena: A review of statistical physics approaches
Office of Scientific and Technical Information (OSTI)
(Journal Article) | DOE PAGES. This content will become publicly available on August 31, 2016. Authors: Lee, Jae Sung; Lee, Shinbuhm; Noh, Tae Won. Affiliations: School of Physics, Korea Institute for Advanced Study, Seoul 130-722, South Korea; Materials Science and Technology Division, Oak Ridge
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE
Office of Scientific and Technical Information (OSTI)
RELIABILITY IMPROVEMENTS 2004 TO 2014 (Conference) | SciTech Connect. Authors: Harris, S.; Gross, R.; Watson, H. Publication Date: 2015-02-04. OSTI Identifier: 1209039. Report Number(s): SRNL-STI-2015-00047
Statistical Surrogate Models for Estimating Probability of High-Consequence
Office of Scientific and Technical Information (OSTI)
Climate Change. (Conference) | SciTech Connect. Abstract not provided. Authors: Field, Richard V.; Boslough, Mark B. E.; Constantine, Paul. Publication Date: 2011-10-01. OSTI Identifier: 1106521. Report Number(s): SAND2011-8231C 465067. DOE Contract Number:
Statistical surrogate models for prediction of high-consequence climate
Office of Scientific and Technical Information (OSTI)
change. (Technical Report) | SciTech Connect. In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on
Statistical and Domain Analytics Applied to PV Module Lifetime and
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Degradation Science | Department of Energy. Presented at the PV Module Reliability Workshop, February 26-27, 2013, Golden, Colorado. PDF: pvmrw13_ps2_casewestern_bruckman.pdf. More Documents & Publications: Literature Review of the Effects of UV Exposure on PV Modules; Failure Rates from Certification Testing to UL
Non-Gaussian mode coupling and the statistical cosmological principle
LoVerde, Marilena; Nelson, Elliot; Shandera, Sarah E-mail: eln121@psu.edu
2013-06-01
Local-type primordial non-Gaussianity couples statistics of the curvature perturbation ζ on vastly different physical scales. Because of this coupling, statistics (i.e., the polyspectra) of ζ in our Hubble volume may not be representative of those in the larger universe; that is, they may be biased. The bias depends on the local background value of ζ, which includes contributions from all modes with wavelength k...
Quality control and statistical process control for nuclear analytical measurements
Seymour, R.; Sergent, F.; Clark, W.H.C.; Gleason, G.
1993-12-31
The same driving forces that are making businesses examine quality control of manufacturing processes are making laboratories reevaluate their quality control programs. Increased regulation (accountability), global competitiveness (profitability), and potential for litigation (defensibility) are the principal driving forces behind the development and implementation of QA/QC programs in the nuclear analytical laboratory. Both manufacturing and scientific quality control can use identical statistical methods, albeit with some differences in the treatment of the measured data. Today, the approaches to QC programs are quite different for most analytical laboratories as compared with manufacturing sciences. This is unfortunate because the statistical process control methods are directly applicable to measurement processes. It is shown that statistical process control methods can provide many benefits for laboratory QC data treatment.
Statistics and Discoveries at the LHC (4/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
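The p-value and discovery-significance language used throughout these lectures rests on the standard conversion between a one-sided p-value and a Gaussian significance Z. The sketch below illustrates that standard relation only; it is not taken from the lecture material itself and uses just the Python standard library.

```python
from statistics import NormalDist

def p_to_z(p):
    """Convert a one-sided p-value into a Gaussian significance Z."""
    return NormalDist().inv_cdf(1.0 - p)

def z_to_p(z):
    """Convert a Gaussian significance Z into a one-sided p-value."""
    return 1.0 - NormalDist().cdf(z)

# The conventional "5 sigma" discovery threshold corresponds to a
# one-sided p-value of roughly 2.9e-7.
p5 = z_to_p(5.0)
```

Note that this simple conversion ignores the look-elsewhere effect also covered in the lectures, which inflates the effective p-value when many signal hypotheses are searched.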
Statistics and Discoveries at the LHC (1/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (3/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Rhapsody: I. Structural Properties and Formation History from a Statistical Sample of Re-simulated Cluster-size Halos (Journal Article) | SciTech Connect
Office of Scientific and Technical Information (OSTI)
Authors: Wu, Hao-Yi (KIPAC, Menlo Park; SLAC; Michigan U.); Hahn, Oliver; Wechsler, Risa H.; Mao, Yao-Yuan; Behroozi, Peter S.; ...
Office of Oil, Gas, and Coal Supply Statistics
Gasoline and Diesel Fuel Update (EIA)
Office of Oil, Gas, and Coal Supply Statistics www.eia.gov Natural Gas Monthly February 2016 U.S. Department of Energy Washington, DC 20585 February 2016 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Office of Oil, Gas, and Coal Supply Statistics www.eia.gov Natural Gas Annual 2014 U.S. Department of Energy Washington, DC 20585 2014 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P. (Livermore, CA); Brandt, James M. (Dublin, CA); Gentile, Ann C. (Dublin, CA); Marzouk, Youssef M. (Oakland, CA); Hale, Darrian J. (San Jose, CA); Thompson, David C. (Livermore, CA)
2011-01-25
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
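The patented method infers full statistical models of "normal" behavior across a population of similar components and raises an alarm when observed values become improbable. The sketch below is a drastically simplified stand-in for that idea: a population z-score with a hypothetical threshold parameter, not the patent's actual model-based probability computation.

```python
import statistics

def alarm_flags(readings, threshold=3.0):
    """Flag components whose sensed value is improbable relative to the
    population of statistically similar components.  A plain z-score is
    used here as a toy substitute for the patent's inferred models; the
    threshold of 3 standard deviations is an illustrative assumption."""
    mu = statistics.fmean(readings)
    sigma = statistics.stdev(readings)
    return [abs(x - mu) / sigma > threshold for x in readings]
```

For example, one component reading 50.0 among a fleet hovering near 10.0 would be flagged while its peers would not, mirroring the goal of catching impending failure early.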
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P. (Livermore, CA); Brandt, James M. (Dublin, CA); Gentile, Ann C. (Dublin, CA); Marzouk, Youssef M. (Oakland, CA); Hale, Darrian J. (San Jose, CA); Thompson, David C. (Livermore, CA)
2011-01-04
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P. (Livermore, CA); Brandt, James M. (Dublin, CA); Gentile, Ann C. (Dublin, CA); Marzouk, Youssef M. (Oakland, CA); Hale, Darrian J. (San Jose, CA); Thompson, David C. (Livermore, CA)
2010-07-13
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
Financial statistics of major publicly owned electric utilities, 1991
Not Available
1993-03-31
The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.
Feature-Based Statistical Analysis of Combustion Simulation Data
Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
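The framework above segments a scalar field into coherent features (via augmented merge trees), then supports per-feature statistics and global diagnostics such as CDFs. As a toy illustration of that workflow, the sketch below segments a 1-D field into connected above-threshold runs and builds an empirical CDF of feature sizes; it is a hand-rolled analogue for illustration, not the paper's merge-tree implementation.

```python
def feature_stats(values, threshold):
    """Segment a 1-D scalar field into connected 'features' above a
    threshold and return per-feature (size, mean) statistics -- a toy
    1-D analogue of the paper's feature segmentation."""
    feats, cur = [], []
    for v in values:
        if v > threshold:
            cur.append(v)
        elif cur:
            feats.append((len(cur), sum(cur) / len(cur)))
            cur = []
    if cur:
        feats.append((len(cur), sum(cur) / len(cur)))
    return feats

def size_cdf(feats):
    """Empirical CDF over feature sizes: sorted (size, fraction <= size)."""
    sizes = sorted(s for s, _ in feats)
    n = len(sizes)
    return [(s, (i + 1) / n) for i, s in enumerate(sizes)]
```

In the real system the segmentation is precomputed once in a streaming pass, so arbitrary thresholds can later be explored interactively against the compact meta-data instead of the terabyte-scale raw simulation.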
Statistical scaling of geometric characteristics in stochastically generated pore microstructures
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee
2015-05-21
In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered as representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
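The core generation step, thresholding a Gaussian random field to obtain a binary void/solid microstructure with a prescribed porosity, can be sketched as follows. This toy version thresholds a spatially uncorrelated field; the cited methods use correlated (filtered) fields, which would add a smoothing step before the threshold.

```python
import numpy as np

def gaussian_microstructure(shape, porosity, seed=0):
    """Generate a binary pore microstructure by thresholding a Gaussian
    random field.  The quantile threshold pins the pore fraction at the
    requested porosity.  Note: this field is spatially uncorrelated, a
    simplification of the correlated fields used in the papers."""
    rng = np.random.default_rng(seed)
    field = rng.standard_normal(shape)
    cut = np.quantile(field, porosity)  # threshold fixing the pore fraction
    return field < cut                  # True = pore (void) voxel

pores = gaussian_microstructure((64, 64), porosity=0.3)
phi = pores.mean()  # measured porosity, approximately 0.30
```

Structure-function scaling analyses like those in the paper would then compute moments of absolute increments of Φ over sub-volumes at varying lags; only the generation step is shown here.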
User Statistics Collection Practices | U.S. DOE Office of Science...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
User Statistics Collection Practices User Facilities User Facilities Home User Facilities at a Glance User Resources User Statistics Policies and Processes Definition Designation...
Starkov, V. N.; Semenov, A. A.; Gomonay, H. V.
2009-07-15
We demonstrate a practical possibility of loss compensation in measured photocounting statistics in the presence of dark counts and background radiation noise. It is shown that satisfactory results are obtained even in the case of low detection efficiency and large experimental errors.
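Loss compensation of photocounting statistics inverts the standard Bernoulli (binomial) loss model that maps a true photon-number distribution through a detector of efficiency η. The sketch below implements only the forward map, which is textbook photocounting theory rather than the authors' specific compensation procedure (their method additionally handles dark counts and background noise).

```python
from math import comb

def detected_statistics(p_n, eta):
    """Forward Bernoulli loss model: the photocount distribution seen by
    a detector of efficiency eta, given the true photon-number
    distribution p_n (index = photon number).  Loss compensation amounts
    to inverting this linear map."""
    n_max = len(p_n) - 1
    q = [0.0] * (n_max + 1)
    for n, pn in enumerate(p_n):
        for m in range(n + 1):
            q[m] += pn * comb(n, m) * eta**m * (1 - eta)**(n - m)
    return q
```

For a single-photon state and η = 0.75, the detector reports one count with probability 0.75 and zero counts with probability 0.25.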
Monthly/Annual Energy Review - renewable section
Reports and Publications (EIA)
2015-01-01
Monthly and latest annual statistics on renewable energy production and consumption and overviews of fuel ethanol and biodiesel.
Summary Statistics for Homemade "Play Dough" Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Martz, A; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough{trademark}-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU{sub D} at 100kVp to a low of about 1200 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 10. LLNL prepared about 50mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images.
(A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first order statistics;' those of the gradient image, 'second order statistics.'
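The first- and second-order statistics described here can be sketched directly from their definitions: mean, standard deviation, and a histogram-based entropy, applied either to the image itself (first order) or to its horizontal gradient (second order). This is an illustrative reading of the report's description; the bin count is an arbitrary assumption, not a parameter taken from the report.

```python
import numpy as np

def order_stats(img, bins=64):
    """Mean, standard deviation, and (Shannon, histogram-based) entropy
    of an image: 'first order' when applied to the image itself,
    'second order' when applied to its gradient image."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist[hist > 0] / hist.sum()
    entropy = -np.sum(p * np.log2(p))
    return float(img.mean()), float(img.std()), float(entropy)

def horizontal_gradient(img):
    """Absolute difference between the image and itself offset by one
    voxel horizontally (along the rows), as in the report."""
    return np.abs(img[:, 1:] - img[:, :-1])
```

A perfectly uniform image has zero standard deviation and zero entropy, and its gradient image is identically zero; real LAC images spread over many bins, giving the entropy values of 6.0 to 7.4 reported above.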
Summary Statistics for Fun Dough Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a Play Dough{trademark}-like product, Fun Dough{trademark}, designated as PD. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2100 LMHU{sub D} at 100kVp to a low of about 1100 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 1% of the mean. The entropy covers the range from 3.9 to 4.6. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 8.5. LLNL prepared about 50mL of the Fun Dough{trademark} in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. Still, layers can plainly be seen in the reconstructed images, indicating that the bulk density of the material in the container is affected by voids and bubbles. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. 
We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first order statistics;' those of the gradient image, 'second order statistics.'
Complex statistics and diffusion in nonlinear disordered particle chains
Antonopoulos, Ch. G.; Bountis, T.; Skokos, Ch.; Drossos, L.
2014-06-15
We investigate dynamically and statistically diffusive motion in a Klein-Gordon particle chain in the presence of disorder. In particular, we examine a low energy (subdiffusive) and a higher energy (self-trapping) case and verify that subdiffusive spreading is always observed. We then carry out a statistical analysis of the motion, in both cases, in the sense of the Central Limit Theorem and present evidence of different chaos behaviors, for various groups of particles. Integrating the equations of motion for times as long as 10{sup 9}, our probability distribution functions always tend to Gaussians and show that the dynamics does not relax onto a quasi-periodic Kolmogorov-Arnold-Moser torus and that diffusion continues to spread chaotically for arbitrarily long times.
Lifetime statistics of quantum chaos studied by a multiscale analysis
Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)
2012-04-30
In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.
Spatial statistics for predicting flow through a rock fracture
Coakley, K.J.
1989-03-01
Fluid flow through a single rock fracture depends on the shape of the space between the upper and lower pieces of rock which define the fracture. In this thesis, the normalized flow through a fracture, i.e. the equivalent permeability of a fracture, is predicted in terms of spatial statistics computed from the arrangement of voids, i.e. open spaces, and contact areas within the fracture. Patterns of voids and contact areas, with complexity typical of experimental data, are simulated by clipping a correlated Gaussian process defined on a N by N pixel square region. The voids have constant aperture; the distance between the upper and lower surfaces which define the fracture is either zero or a constant. Local flow is assumed to be proportional to local aperture cubed times local pressure gradient. The flow through a pattern of voids and contact areas is solved using a finite-difference method. After solving for the flow through simulated 10 by 10 by 30 pixel patterns of voids and contact areas, a model to predict equivalent permeability is developed. The first model is for patterns with 80% voids where all voids have the same aperture. The equivalent permeability of a pattern is predicted in terms of spatial statistics computed from the arrangement of voids and contact areas within the pattern. Four spatial statistics are examined. The change point statistic measures how often adjacent pixels alternate from void to contact area (or vice versa) in the rows of the patterns which are parallel to the overall flow direction. 37 refs., 66 figs., 41 tabs.
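The change point statistic has a direct computational reading: count how often adjacent pixels alternate between void and contact area along the rows parallel to the overall flow direction. The sketch below follows that definition as stated in the abstract; the encoding (1 = void, 0 = contact area) and the normalization by the number of adjacent pairs are illustrative assumptions.

```python
def change_point_statistic(pattern):
    """Fraction of adjacent pixel pairs, along each row (taken to be the
    overall flow direction), where the pattern alternates between void
    (1) and contact area (0)."""
    changes = pairs = 0
    for row in pattern:
        for a, b in zip(row, row[1:]):
            pairs += 1
            changes += (a != b)
    return changes / pairs
```

A highly fragmented void pattern (frequent alternation) yields a value near 1, while large connected voids yield a value near 0, which is why the statistic carries information about flow channeling.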
ARSCL Cloud Statistics - A Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ARSCL Cloud Statistics - A Value-Added Product Y. Shi Pacific Northwest National Laboratory Richland, Washington M. A. Miller Brookhaven National Laboratory Upton, New York Introduction The active remote sensing of cloud layers (ARSCL) value-added product (VAP) combines data from active remote sensors to produce an objective determination of cloud location, radar reflectivity, vertical velocity, and Doppler spectral width. Information about the liquid water path (LWP) in these clouds and the
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Doppler Lidar Vertical Velocity Statistics Value-Added Product RK Newsom C Sivaraman TR Shippert LD Riihimaki July 2015 DISCLAIMER This report was prepared as an account of work sponsored by the U.S. Government. Neither the United States nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or
Structure Learning and Statistical Estimation in Distribution Networks - Part I (Technical Report) | SciTech Connect
Office of Scientific and Technical Information (OSTI)
Traditionally, power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load-monitoring. In this two-part paper, inspired by...
Structure Learning and Statistical Estimation in Distribution Networks - Part II (Technical Report) | SciTech Connect
Office of Scientific and Technical Information (OSTI)
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements.
MISR-Derived Statistics of Cumulus Geometry at TWP Site
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
MISR-Derived Statistics of Cumulus Geometry at TWP Site E. I. Kassianov, T. P. Ackerman, and R. T. Marchand Pacific Northwest National Laboratory Richland, Washington Introduction The multi-angle imaging spectroradiometer (MISR), recently launched on the National Aeronautics and Space Administration (NASA) Terra platform, provides high-resolution measurements of reflectance at nine different viewing angles. Multi-angle satellite observations have been successfully used to derive the cloud
Financial statistics of selected investor-owned electric utilities, 1989
Not Available
1991-01-01
The Financial Statistics of Selected Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.
U.S. Department of Commerce Economics and Statistics Administration
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Commerce Economics and Statistics Administration. [Chart: women hold 48% of all jobs but only 24% of STEM jobs; men hold 52% and 76%, respectively.] By David Beede, Tiffany Julian, David Langdon, George McKittrick, Beethika Khan, and Mark Doms, Office of the Chief Economist. Women in STEM: A Gender Gap to Innovation, August 2011, Executive Summary, ESA Issue Brief #04-11. Our science, technology, engineering and math (STEM) workforce is crucial to America's innovative capacity and global competitiveness. Yet women are vastly...
Spectral statistics in noninteracting many-particle systems
Munoz, L.; Relano, A.; Retamosa, J. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Faleiro, E. [Departamento de Fisica Aplicada, E.U.I.T. Industrial, Universidad Politecnica de Madrid, E-28012 Madrid (Spain); Molina, R.A. [Max-Planck-Institut fuer Physik Komplexer Systeme, Noethnitzer Strasse 38, D-01187 Dresden (Germany)
2006-03-15
It is widely accepted that the statistical properties of energy level spectra provide an essential characterization of quantum chaos. Indeed, the spectral fluctuations of many different systems like quantum billiards, atoms, or atomic nuclei have been studied. However, noninteracting many-body systems have received little attention, since it is assumed that they must exhibit Poisson-like fluctuations. Apart from a heuristic argument of Bloch, there are neither systematic numerical calculations nor a rigorous derivation of this fact. Here we present a rigorous study of the spectral fluctuations of noninteracting identical particles moving freely in a mean field emphasizing the evolution with the number of particles N as well as with the energy. Our results are conclusive. For N ≥ 2 the spectra of these systems exhibit Poisson fluctuations provided that we consider sufficiently high excitation energies. Nevertheless, when the mean field is chaotic there exists a critical energy scale L{sub c}; beyond this scale, the fluctuations deviate from the Poisson statistics as a reminiscence of the statistical properties of the mean field.
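The Poisson fluctuations discussed here are conventionally diagnosed via the nearest-neighbor spacing distribution of the unfolded spectrum: for uncorrelated levels the spacing density is exp(-s), so the fraction of spacings below the mean is 1 - e^(-1) ≈ 0.632. The sketch below is a standard spacing-statistics check on synthetic uncorrelated levels, not the authors' calculation.

```python
import numpy as np

def spacing_distribution(levels):
    """Nearest-neighbour spacings of a spectrum, unfolded to unit mean
    spacing.  For uncorrelated (Poisson) levels the spacing density
    is exp(-s); level repulsion (quantum chaos) suppresses small s."""
    s = np.diff(np.sort(levels))
    return s / s.mean()

# Synthetic 'spectrum' of uncorrelated levels as a Poisson baseline.
rng = np.random.default_rng(1)
s = spacing_distribution(rng.uniform(0.0, 1.0, 20000))
frac_below_1 = (s < 1).mean()  # approximately 1 - exp(-1) = 0.632
```

A Wigner-Dyson (chaotic) spectrum would instead show a deficit of small spacings, which is the deviation the abstract reports beyond the critical scale L{sub c}.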
Electron transfer statistics and thermal fluctuations in molecular junctions
Goswami, Himangshu Prabal; Harbola, Upendra
2015-02-28
We derive analytical expressions for the probability distribution function (PDF) for electron transport in a simple model of a quantum junction in the presence of thermal fluctuations. Our approach is based on large deviation theory combined with the generating function method. For a large number of electrons transferred, the PDF is found to decay exponentially in the tails, with different rates due to the applied bias. This asymmetry in the PDF is related to the fluctuation theorem. Statistics of fluctuations are analyzed in terms of the Fano factor. Thermal fluctuations play a quantitative role in determining the statistics of electron transfer; they tend to suppress the average current while enhancing the fluctuations in particle transfer. This gives rise to both bunching and antibunching phenomena, as determined by the Fano factor. The thermal fluctuations and shot noise compete with each other and determine the net (effective) statistics of particle transfer. An exact analytical expression is obtained for the delay time distribution. The optimal values of the delay time between successive electron transfers can be lowered below the corresponding shot noise values by tuning the thermal effects.
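The Fano factor used above to classify bunching and antibunching is the standard variance-to-mean ratio of the transferred-electron counts; the one-liner below just encodes that textbook definition, independent of this paper's junction model.

```python
import statistics

def fano_factor(counts):
    """Fano factor F = variance / mean of transferred-electron counts:
    F < 1 indicates antibunching (sub-Poissonian statistics),
    F = 1 Poissonian, and F > 1 bunching (super-Poissonian)."""
    return statistics.pvariance(counts) / statistics.fmean(counts)
```

A perfectly regular transfer record has F = 0 (maximal antibunching), while strongly clustered transfers push F above 1, the bunching regime the thermal fluctuations can induce.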
User Statistics Collection Practices Archives | U.S. DOE Office of Science
Office of Science (SC) Website
EIA - Advice from Meetings of the ASA Committee on Energy Statistics
Gasoline and Diesel Fuel Update (EIA)
Advice from Meetings of the ASA Committee on Energy Statistics: Transcripts and Summaries from the American Statistical Association Committee on Energy Statistics. The U.S. Energy Information Administration seeks technical advice semi-annually from the American Statistical Association Committee on Energy Statistics. The meetings are held in the spring and fall in Washington, D.C., and are announced in the Federal Register. These meetings are open to the public and are typically held on Thursdays.
Advances on statistical/thermodynamical models for unpolarized structure functions
Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro
2013-03-25
During the eighties and nineties, many statistical/thermodynamical models were proposed to describe the nucleons' structure functions and the distribution of quarks in hadrons. Most of these models describe the constituent quarks and gluons inside the nucleon as a Fermi/Bose gas, respectively, confined in an MIT bag with continuous energy levels. Other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, like the sea asymmetries d̄/ū and d̄ − ū.
Quantum Statistical Testing of a Quantum Random Number Generator
Humble, Travis S
2014-01-01
The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.
Statistical Software for spatial analysis of stratigraphic data sets
Energy Science and Technology Software Center (OSTI)
2003-04-08
Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages. These include techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.
Statistics at work in heavy-ion reactions
Moretto, L.G.
1982-07-01
In the first part special aspects of the compound nucleus decay are considered. The evaporation of particles intermediate between nucleons and fission fragments is explored both theoretically and experimentally. The limitations of the fission decay width expression obtained with the transition state method are discussed, and a more general approach is proposed. In the second part the process of angular momentum transfer in deep inelastic reactions is considered. The limit of statistical equilibrium is studied and specifically applied to the estimation of the degree of alignment of the fragment spins. The magnitude and alignment of the transferred angular momentum is experimentally determined from sequentially emitted alpha, gamma, and fission fragments.
The Fall Meeting of the Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
* * * * * FRIDAY, NOVEMBER 5, 1999. The Fall Meeting of the Committee on Energy Statistics commenced at 8:30 a.m. at the Department of Energy, 1000 Independence Avenue, S.W., Room 8E089, Washington, D.C., Daniel Relles, presiding. PRESENT: DANIEL RELLES, Chairman; JAY BREIDT; LYNDA CARLSON; THOMAS COWING; CAROL GOTWAY CRAWFORD; JAY HAKES; JAMES HAMMITT; PHILIP HANSER; CALVIN KENT; W. DAVID MONTGOMERY; LARRY PETTIS; SEYMOUR SUDMAN; BILL WEINIG; ROY WHITMORE
Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials
J. J. Einerson
2005-05-01
Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
Statistical Inference for Big Data Problems in Molecular Biophysics
Ramanathan, Arvind; Savol, Andrej; Burger, Virginia; Quinn, Shannon; Agarwal, Pratul K; Chennubhotla, Chakra
2012-01-01
We highlight the role of statistical inference techniques in providing biological insights from analyzing long time-scale molecular simulation data. Technological and algorithmic improvements in computation have brought molecular simulations to the forefront of techniques applied to investigating the basis of living systems. While these longer, increasingly complex simulations, presently reaching petabyte scales, promise a detailed view into microscopic behavior, teasing out the important information has now become a true challenge on its own. Mining this data for important patterns is critical to automating therapeutic intervention discovery, improving protein design, and fundamentally understanding the mechanistic basis of cellular homeostasis.
Statistical anisotropy of the curvature perturbation from vector field perturbations
Dimopoulos, Konstantinos; Karciauskas, Mindaugas; Lyth, David H.; Rodriguez, Yeinzon E-mail: m.karciauskas@lancaster.ac.uk E-mail: yeinzon.rodriguez@uan.edu.co
2009-05-15
The δN formula for the primordial curvature perturbation ζ is extended to include vector as well as scalar fields. Formulas for the tree-level contributions to the spectrum and bispectrum of ζ are given, exhibiting statistical anisotropy. The one-loop contribution to the spectrum of ζ is also worked out. We then consider the generation of vector field perturbations from the vacuum, including the longitudinal component that will be present if there is no gauge invariance. Finally, the δN formula is applied to the vector curvaton and vector inflation models, with the tensor perturbation also evaluated in the latter case.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
U.S. Department of Energy, Washington, DC 20585. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the U.S. Department of Energy or other federal agencies.
Poincaré recurrence statistics as an indicator of chaos synchronization
Boev, Yaroslav I.; Vadivasova, Tatiana E.; Anishchenko, Vadim S.
2014-06-15
The dynamics of the autonomous and non-autonomous Rössler system is studied using the Poincaré recurrence time statistics. It is shown that the probability distribution density of Poincaré recurrences represents a set of equidistant peaks with the distance that is equal to the oscillation period and the envelope obeys an exponential distribution. The dimension of the spatially uniform Rössler attractor is estimated using Poincaré recurrence times. The mean Poincaré recurrence time in the non-autonomous Rössler system is locked by the external frequency, and this enables us to detect the effect of phase-frequency synchronization.
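A crude way to see the equidistant-peak structure of recurrence-time distributions is to collect Poincaré recurrences of a trajectory directly. This sketch uses a synthetic circular orbit rather than the Rössler system itself, so every recurrence time equals the period:

```python
import math

def recurrence_times(traj, eps):
    """Intervals between successive returns of a trajectory to an
    eps-ball around its initial state; consecutive samples inside the
    same visit are collapsed into a single recurrence event."""
    x0 = traj[0]
    hits = [i for i, x in enumerate(traj) if math.dist(x, x0) < eps]
    events = [t for j, t in enumerate(hits) if j == 0 or t - hits[j - 1] > 1]
    return [b - a for a, b in zip(events, events[1:])]

# A circular orbit with period 100 steps: all recurrence times equal
# the period, the degenerate limit of the equidistant-peak structure
# reported for the Rossler attractor.
traj = [(math.sin(2 * math.pi * t / 100), math.cos(2 * math.pi * t / 100))
        for t in range(1000)]
taus = recurrence_times(traj, 0.05)
```

For a chaotic attractor the same histogram spreads into equidistant peaks under an exponential envelope, which is the statistic the paper exploits.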
Statistical thermodynamics of strain hardening in polycrystalline solids
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Langer, James S.
2015-09-18
This paper starts with a systematic rederivation of the statistical thermodynamic equations of motion for dislocation-mediated plasticity proposed in 2010 by Langer, Bouchbinder, and Lookman. The paper then uses that theory to explain the anomalous rate-hardening behavior reported in 1988 by Follansbee and Kocks and to explore the relation between hardening rate and grain size reported in 1995 by Meyers et al. A central theme is the need for physics-based, nonequilibrium analyses in developing predictive theories of the strength of polycrystalline materials.
Financial statistics of major US publicly owned electric utilities 1993
Not Available
1995-02-01
The 1993 edition of the Financial Statistics of Major U.S. Publicly Owned Electric Utilities publication presents five years (1989 to 1993) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. The primary source of publicly owned financial data is the Form EIA-412, the Annual Report of Public Electric Utilities, filed on a fiscal basis.
Younger Dryas Boundary (YDB) impact: physical and statistical impossibility (Conference)
Office of Scientific and Technical Information (OSTI)
The YDB impact hypothesis of Firestone et al. (2007) is so extremely improbable that it can be considered statistically impossible, in addition to being physically impossible. Comets make up only about 1% of the population of Earth-crossing objects. Broken comets are a vanishingly small fraction, and only exist as Earth-sized clusters for a very short period of time. Only a small fraction of impacts occur at angles as shallow as proposed by the YDB
Image segmentation by hierarchical agglomeration of polygons using ecological statistics
Prasad, Lakshman; Swaminarayan, Sriram
2013-04-23
A method for rapid hierarchical image segmentation based on perceptually driven contour completion and scene statistics is disclosed. The method begins with an initial fine-scale segmentation of an image, such as obtained by perceptual completion of partial contours into polygonal regions using region-contour correspondences established by Delaunay triangulation of edge pixels as implemented in VISTA. The resulting polygons are analyzed with respect to their size and color/intensity distributions and the structural properties of their boundaries. Statistical estimates of granularity of size, similarity of color, texture, and saliency of intervening boundaries are computed and formulated into logical (Boolean) predicates. The combined satisfiability of these Boolean predicates by a pair of adjacent polygons at a given segmentation level qualifies them for merging into a larger polygon representing a coarser, larger-scale feature of the pixel image and collectively obtains the next level of polygonal segments in a hierarchy of fine-to-coarse segmentations. The iterative application of this process precipitates textured regions as polygons with highly convolved boundaries and helps distinguish them from objects which typically have more regular boundaries. The method yields a multiscale decomposition of an image into constituent features that enjoy a hierarchical relationship with features at finer and coarser scales. This provides a traversable graph structure from which feature content and context in terms of other features can be derived, aiding in automated image understanding tasks. The method disclosed is highly efficient and can be used to decompose and analyze large images.
View discovery in OLAP databases through statistical combinatorial optimization
Hengartner, Nick W; Burke, John; Critchlow, Terence; Joslyn, Cliff; Hogan, Emilie
2009-01-01
OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information-theoretic measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
Statistics of anisotropies in inflation with spectator vector fields
Thorsrud, Mikjel; Mota, David F.; Urban, Federico R. E-mail: furban@ulb.ac.be
2014-04-01
We study the statistics of the primordial power spectrum in models where massless gauge vectors are coupled to the inflaton, paying special attention to observational implications of having fundamental or effective horizons embedded in a bath of infrared fluctuations. As quantum infrared modes cross the horizon, they classicalize and build a background vector field. We find that the vector experiences a statistical precession phenomenon. Implications for primordial correlators and the interpretation thereof are considered. Firstly, we show how in general two, not only one, additional observables, a quadrupole amplitude and an intrinsic shape parameter, are necessary to fully describe the correction to the curvature power spectrum, and develop a unique parametrization for them. Secondly, we show that the observed anisotropic amplitude and the associated preferred direction depend on the volume of the patch being probed. We calculate non-zero priors for the expected deviations between detections based on microwave background data (which probes the entire Hubble patch) and large scale structure (which only probes a fraction of it).
Correlating sampling and intensity statistics in nanoparticle diffraction experiments
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Öztürk, Hande; Yan, Hanfei; Hill, John P.; Noyan, I. Cevdet
2015-07-28
It is shown in a previous article [Öztürk, Yan, Hill & Noyan (2014). J. Appl. Cryst. 47, 1016-1025] that the sampling statistics of diffracting particle populations within a polycrystalline ensemble depended on the size of the constituent crystallites: broad X-ray peak breadths enabled some nano-sized particles to contribute more than one diffraction spot to Debye-Scherrer rings. Here it is shown that the equations proposed by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] (AKK) to link diffracting particle and diffracted intensity statistics are not applicable if the constituent crystallites of the powder are below 10 nm. In this size range, (i) the one-to-one correspondence between diffracting particles and Laue spots assumed in the AKK analysis is not satisfied, and (ii) the crystallographic correlation between Laue spots originating from the same grain invalidates the assumption that all diffracting plane normals are randomly oriented and uncorrelated. Such correlation produces unexpected results in the selection of diffracting grains. For example, three or more Laue spots from a given grain for a particular reflection can only be observed at certain wavelengths. In addition, correcting the diffracted intensity values by the traditional Lorentz term, 1/cos θ, to compensate for the variation of particles sampled within a reflection band does not maintain fidelity to the number of poles contributing to the diffracted signal. A new term, cos θ_B/cos θ, corrects this problem.
Lightweight and Statistical Techniques for Petascale Debugging
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased.
We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces, application-specific classification parameters, such as global variables, statistical data acquisition techniques and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories, new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability, and Dyninst binary analysis and instrumentation toolkits.
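The equivalence-class idea described above can be sketched in a few lines: group task ranks by identical stack traces and attach a traditional debugger to one representative per class. The traces below are hypothetical, not STAT output:

```python
from collections import defaultdict

def equivalence_classes(stack_traces):
    """Group task ranks whose stack traces are identical; a developer
    then debugs one representative per class instead of every task."""
    classes = defaultdict(list)
    for rank, trace in stack_traces.items():
        classes[tuple(trace)].append(rank)
    return classes

# Hypothetical traces from four MPI ranks: 0 and 2 are blocked in the
# same MPI_Recv path, while 1 and 3 are still in compute().
traces = {
    0: ["main", "solver_step", "MPI_Recv"],
    1: ["main", "solver_step", "compute"],
    2: ["main", "solver_step", "MPI_Recv"],
    3: ["main", "solver_step", "compute"],
}
classes = equivalence_classes(traces)
representatives = sorted(ranks[0] for ranks in classes.values())
```

At scale, the payoff is that the number of classes grows far more slowly than the number of tasks, so the debug targets stay within reach of a conventional debugger.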
n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator
Energy Science and Technology Software Center (OSTI)
2012-09-12
nSIGHTS (n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator) is a comprehensive well test analysis software package. It provides a user-interface, a well test analysis model and many tools to analyze both field and simulated data. The well test analysis model simulates a single-phase, one-dimensional, radial/non-radial flow regime, with a borehole at the center of the modeled flow system. nSIGHTS solves the radially symmetric n-dimensional forward flow problem using a solver based on a graph-theoretic approach. The results of the forward simulation are pressure, and flow rate, given all the input parameters. The parameter estimation portion of nSIGHTS uses a perturbation-based approach to interpret the best-fit well and reservoir parameters, given an observed dataset of pressure and flow rate.
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. Then the structure learning algorithm is extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. 
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
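The baseline statistical approach described here, dressing a point forecast with empirical quantiles of past forecasting errors, can be sketched as follows. The error sample and forecast value are invented for illustration:

```python
def empirical_quantile(xs, q):
    """Empirical quantile with linear interpolation, 0 <= q <= 1."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo_i = int(pos)
    hi_i = min(lo_i + 1, len(s) - 1)
    return s[lo_i] + (pos - lo_i) * (s[hi_i] - s[lo_i])

def interval_forecast(point_mw, past_errors, coverage=0.8):
    """Dress a deterministic wind power forecast with an interval built
    from the empirical quantiles of past forecasting errors."""
    alpha = (1 - coverage) / 2
    return (point_mw + empirical_quantile(past_errors, alpha),
            point_mw + empirical_quantile(past_errors, 1 - alpha))

errors = [-9, -6, -4, -2, -1, 0, 1, 3, 5, 8, 12]  # hypothetical MW errors
lo, hi = interval_forecast(100.0, errors, coverage=0.8)  # (94.0, 108.0)
```

Quantile regression and the kernel density methods studied in the report refine this idea by letting the error quantiles depend on the forecast conditions instead of being global constants.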
Statistical measures of Planck scale signal correlations in interferometers
Hogan, Craig J.; Kwon, Ohkyung
2015-06-22
A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. As a result, simple projections of sensitivity for specific experimental set-ups suggests that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.
Statistical Stability and Time-Reversal Imaging in Random Media
Berryman, J; Borcea, L; Papanicolaou, G; Tsogka, C
2002-02-05
Localization of targets imbedded in a heterogeneous background medium is a common problem in seismic, ultrasonic, and electromagnetic imaging problems. The best imaging techniques make direct use of the eigenfunctions and eigenvalues of the array response matrix, as recent work on time-reversal acoustics has shown. Of the various imaging functionals studied, one that is representative of a preferred class is a time-domain generalization of MUSIC (MUltiple Signal Classification), which is a well-known linear subspace method normally applied only in the frequency domain. Since statistical stability is not characteristic of the frequency domain, a transform back to the time domain after first diagonalizing the array data in the frequency domain takes optimum advantage of both the time-domain stability and the frequency-domain orthogonality of the relevant eigenfunctions.
Statistical properties of Charney-Hasegawa-Mima zonal flows
Anderson, Johan; Botha, G. J. J.
2015-05-15
A theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent plasma transport events in unforced zonal flows is provided within the Charney-Hasegawa-Mima (CHM) model. The governing equation is solved numerically with various prescribed density gradients that are designed to produce different configurations of parallel and anti-parallel streams. Long-lasting vortices form whose flow is governed by the zonal streams. It is found that the numerically generated PDFs can be matched with analytical predictions of PDFs based on the instanton method by removing the autocorrelations from the time series. In many instances, the statistics generated by the CHM dynamics relax to Gaussian distributions for both the electrostatic and vorticity perturbations, whereas in areas with strong nonlinear interactions it is found that the PDFs are exponentially distributed.
Statistical Methods and Tools for Hanford Staged Feed Tank Sampling
Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.
2013-10-01
This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).
Statistical tools for prognostics and health management of complex systems
Collins, David H; Huzurbazar, Aparna V; Anderson - Cook, Christine M
2010-01-01
Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
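The component-to-system prediction step mentioned above can be illustrated with the simplest case, a series system of independent, exponentially failing components. The failure rates below are hypothetical:

```python
import math

def series_system_reliability(rates, t):
    """Reliability at time t of a series system of independent
    components with constant (exponential) failure rates: the system
    survives only if every component does."""
    return math.prod(math.exp(-lam * t) for lam in rates)

# Hypothetical per-hour failure rates for three components.
rates = [0.001, 0.0005, 0.002]
R_100 = series_system_reliability(rates, 100.0)  # P(operational at 100 h)
mttf = 1.0 / sum(rates)                          # mean time to first failure
```

Condition measurements and covariates enter by making the rates time- or state-dependent, and Bayesian updating replaces the fixed rates with posterior distributions, but the series-system composition rule is the same.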
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
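The quasi-static, sequential resolution of failures can be caricatured with a toy redistribution model. A real implementation would re-solve the DC power flow after each trip; this sketch deliberately replaces that step with uniform redistribution, so it illustrates only the iteration structure, not the physics:

```python
def cascade(flows, capacities):
    """Toy quasi-static cascade: every overloaded line trips, its flow
    is spread uniformly over the surviving lines, and the check repeats
    until no line exceeds its capacity."""
    flows = dict(flows)
    while True:
        tripped = [l for l, f in flows.items() if f > capacities[l]]
        if not tripped:
            return flows
        shed = sum(flows.pop(l) for l in tripped)
        if not flows:           # every line has tripped: total collapse
            return {}
        for l in flows:
            flows[l] += shed / len(flows)

capacities = {"a": 10.0, "b": 10.0, "c": 15.0}
final = cascade({"a": 11.0, "b": 4.0, "c": 5.0}, capacities)
```

Here line "a" trips, its 11 MW is split between "b" and "c", and the cascade arrests because both survivors stay within capacity.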
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Newsom, RK; Sivaraman, C; Shippert, TR; Riihimaki, LD
2015-07-01
Measurements of vertical velocity fluctuations are crucial for improved understanding of turbulent mixing and diffusion, convective initiation, and cloud life cycles. The Atmospheric Radiation Measurement (ARM) Climate Research Facility operates coherent Doppler lidar systems at several sites around the globe. These instruments provide measurements of clear-air vertical velocity profiles in the lower troposphere with a nominal temporal resolution of 1 s and height resolution of 30 m. The purpose of the Doppler lidar vertical velocity statistics (DLWSTATS) value-added product (VAP) is to produce height- and time-resolved estimates of vertical velocity variance, skewness, and kurtosis from these raw measurements. The VAP also produces estimates of cloud properties, including cloud-base height (CBH), cloud frequency, cloud-base vertical velocity, and cloud-base updraft fraction.
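The variance, skewness, and kurtosis profiles the VAP produces are standard central-moment statistics. A minimal sketch, using a made-up velocity sample and reporting excess kurtosis, could look like:

```python
def moment_stats(w):
    """Variance, skewness, and excess kurtosis of a sample of
    vertical-velocity values, via central moments."""
    n = len(w)
    mean = sum(w) / n
    m2 = sum((x - mean) ** 2 for x in w) / n
    m3 = sum((x - mean) ** 3 for x in w) / n
    m4 = sum((x - mean) ** 4 for x in w) / n
    variance = m2
    skewness = m3 / m2 ** 1.5
    kurtosis = m4 / m2 ** 2 - 3.0  # excess kurtosis; 0 for a Gaussian
    return variance, skewness, kurtosis

# Symmetric sample: skewness should vanish
var, skew, kurt = moment_stats([-2.0, -1.0, 0.0, 1.0, 2.0])
print(var, skew, kurt)
```

The actual VAP computes these per height bin over a time window; the per-sample arithmetic is the same.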
Predicting weak lensing statistics from halo mass reconstructions - Final Paper
Everett, Spencer
2015-08-20
As dark matter does not absorb or emit light, its distribution in the universe must be inferred through indirect effects such as the gravitational lensing of distant galaxies. While most sources are only weakly lensed, the systematic alignment of background galaxies around a foreground lens can constrain the mass of the lens, which is largely in the form of dark matter. In this paper, I have implemented a framework to reconstruct all of the mass along lines of sight using a best-case dark matter halo model in which the halo mass is known. This framework is then used to make predictions of the weak lensing of 3,240 generated source galaxies through a 324 arcmin² field of the Millennium Simulation. The lensed source ellipticities are characterized by the ellipticity-ellipticity and galaxy-mass correlation functions and compared to the same statistics for the intrinsic and ray-traced ellipticities. In the ellipticity-ellipticity correlation function, I find that the framework systematically underpredicts the shear power by an average factor of 2.2 and fails to capture correlation from dark matter structure at scales larger than 1 arcminute. The model-predicted galaxy-mass correlation function is in agreement with the ray-traced statistic from scales 0.2 to 0.7 arcminutes, but systematically underpredicts shear power at scales larger than 0.7 arcminutes by an average factor of 1.2. Optimization of the framework code has reduced the mean CPU time per lensing prediction by 70% to 24 ± 5 ms. Physical and computational shortcomings of the framework are discussed, as well as potential improvements for upcoming work.
Spatial Statistical Procedures to Validate Input Data in Energy Models
Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.
2006-01-01
Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.
Detailed Monthly and Annual LNG Import Statistics (2004-2012) | Department of Energy
More Documents & Publications: U.S. LNG Imports and Exports (2004-2012); Natural Gas Imports and Exports Fourth Quarter Report 2013; LNG Safety Research Report to Congress
User Statistics Collection Practices | U.S. DOE Office of Science (SC)
Contact: Office of Science, U.S. Department of Energy, 1000 Independence Ave., SW, Washington, DC 20585
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of the three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs.
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed, and the results of this analysis are reported. The data were generally found to follow a normal distribution, and to be homogeneous across composite samples.
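Since the data were generally found to be normal, one of the UCL95 procedures in the EPA guidance reduces to a one-sided Student's t limit. The sketch below shows only that normal-theory case; the concentration values and the tabulated critical value t_{0.95, df=8} = 1.860 are illustrative, not from the report:

```python
import math

def ucl95_normal(data, t_crit):
    """One-sided upper 95% confidence limit on the mean, assuming
    normally distributed measurements (Student's t interval).
    t_crit is the 95th-percentile t value for df = n - 1."""
    n = len(data)
    mean = sum(data) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return mean + t_crit * s / math.sqrt(n)

# Hypothetical analyte concentrations (9 measurements);
# t_{0.95, df=8} = 1.860 from a t table.
data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0, 4.2]
print(round(ucl95_normal(data, 1.860), 3))
```

For skewed or censored data the guidance prescribes other estimators (e.g., Chebyshev- or bootstrap-based limits), which this sketch does not cover.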
Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields
John A. Krommes
2001-02-16
A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. 
Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.
User Experience Research and Statistics for the Web
Broader source: Energy.gov [DOE]
To improve its websites and applications, especially for new projects, the Office of Energy Efficiency and Renewable Energy (EERE) strongly recommends, but does not require, conducting user experience (UX) research.
Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
Duran, A.; Walkowicz, K.
2013-10-01
In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.
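Duty-cycle characterization of the kind described reduces a recorded speed trace to kinematic metrics before any fleet-level statistics are computed. A minimal sketch on a made-up 1 Hz trace, with illustrative metric choices (distance, average driving speed, maximum acceleration, stop count):

```python
def duty_cycle_metrics(speed_mps, dt=1.0):
    """Basic kinematic duty-cycle metrics from a 1 Hz speed trace."""
    distance = sum(v * dt for v in speed_mps)
    accels = [(b - a) / dt for a, b in zip(speed_mps, speed_mps[1:])]
    driving = [v for v in speed_mps if v > 0.1]
    # A "stop" is a transition from moving to (near-)zero speed
    stops = sum(1 for a, b in zip(speed_mps, speed_mps[1:])
                if a > 0.1 and b <= 0.1)
    return {
        "distance_m": distance,
        "avg_driving_speed_mps": sum(driving) / len(driving) if driving else 0.0,
        "max_accel_mps2": max(accels) if accels else 0.0,
        "stops": stops,
    }

# Hypothetical 10-second trace: accelerate, cruise, brake to a stop
trace = [0.0, 1.0, 2.0, 4.0, 6.0, 6.0, 6.0, 4.0, 2.0, 0.0]
metrics = duty_cycle_metrics(trace)
print(metrics)
```

The NREL analysis applies metrics like these across 1,500+ route shifts and then examines their distributions; the specific metric definitions used there may differ.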
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles: Preprint
Prohaska, R.; Duran, A.; Ragatz, A.; Kelly, K.
2015-05-01
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy's (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real-world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed.
Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to medium duty EVs.
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
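The NRMSE metric used in the study can be sketched as follows. This assumes normalization by the observed mean; conventions vary (range- or capacity-normalization are also common), and the forecast/observation values are invented:

```python
import math

def nrmse(forecast, observed):
    """Root mean squared error normalized by the observed mean."""
    n = len(observed)
    rmse = math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n)
    return rmse / (sum(observed) / n)

# Hypothetical hourly power forecasts vs. observations (kW)
forecast = [10.0, 20.0, 30.0, 40.0]
observed = [12.0, 18.0, 33.0, 37.0]
print(round(nrmse(forecast, observed), 4))
```

When comparing models, as the study does, the same normalization convention must be applied to every model for the percentage reductions to be meaningful.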
International petroleum statistics report, January 1992 [Contains Glossary]
1992-01-01
The International Petroleum Statistics Report presents data on international oil production, consumption, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil consumption and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/consumption balance for the market economies (i.e., non-communist countries). This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, consumption, and trade in OECD countries. World oil production and OECD consumption data are for the years 1970 through 1990; OECD stocks from 1973 through 1990; and OECD trade from 1982 through 1990.
Statistical mechanics of self-driven Carnot cycles
Smith, E.
1999-10-01
The spontaneous generation and finite-amplitude saturation of sound, in a traveling-wave thermoacoustic engine, are derived as properties of a second-order phase transition. It has previously been argued that this dynamical phase transition, called "onset," has an equivalent equilibrium representation, but the saturation mechanism and scaling were not computed. In this work, the sound modes implementing the engine cycle are coarse-grained and statistically averaged, in a partition function derived from microscopic dynamics on criteria of scale invariance. Self-amplification performed by the engine cycle is introduced through higher-order modal interactions. Stationary points and fluctuations of the resulting phenomenological Lagrangian are analyzed and related to background dynamical currents. The scaling of the stable sound amplitude near the critical point is derived and shown to arise universally from the interaction of finite-temperature disorder with the order induced by self-amplification. © 1999 The American Physical Society
Statistical properties of super-hot solar flares
Caspi, Amir; Krucker, Säm; Lin, R. P.
2014-01-20
We use Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) high-resolution imaging and spectroscopy observations from ~6 to 100 keV to determine the statistical relationships between measured parameters (temperature, emission measure, etc.) of hot, thermal plasma in 37 intense (GOES M- and X-class) solar flares. The RHESSI data, most sensitive to the hottest flare plasmas, reveal a strong correlation between the maximum achieved temperature and the flare GOES class, such that 'super-hot' temperatures >30 MK are achieved almost exclusively by X-class events; the observed correlation differs significantly from that of GOES-derived temperatures, and from previous studies. A nearly ubiquitous association with high emission measures, electron densities, and instantaneous thermal energies suggests that super-hot plasmas are physically distinct from cooler, ~10-20 MK GOES plasmas, and that they require substantially greater energy input during the flare. High thermal energy densities suggest that super-hot flares require strong coronal magnetic fields, exceeding ~100 G, and that both the plasma β and the volume filling factor f cannot be much less than unity in the super-hot region.
Plutonium metal exchange program : current status and statistical analysis
Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.
2004-01-01
The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program will be discussed, with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous values and/or censored values), and interesting features of the data and the results.
Financial statistics major US publicly owned electric utilities 1996
1998-03-01
The 1996 edition of The Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 5 years (1992 through 1996) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Five years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, and operating revenue, and electric energy account data. 2 figs., 32 tabs.
Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes
Uhl, Jonathan T.; Pathak, Shivesh; Schorlemmer, Danijel; Liu, Xin; Swindeman, Ryan; Brinkman, Braden A. W.; LeBlanc, Michael; Tsekenis, Georgios; Friedman, Nir; Behringer, Robert; et al
2015-11-17
Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied with the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. In conclusion, the results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
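The reported size distribution, a power law with a stress-dependent exponential cutoff, can be written down directly. The exponent and cutoff values below are illustrative, not fits from the paper:

```python
import math

def slip_size_density(s, tau, s_cut):
    """Un-normalized avalanche-size density: a power law with an
    exponential cutoff, P(s) ~ s**(-tau) * exp(-s / s_cut).
    Raising the cutoff (higher applied stress) extends the
    power-law regime to larger slips."""
    return s ** (-tau) * math.exp(-s / s_cut)

# The cutoff suppresses large slips: compare two stress levels
low_stress = slip_size_density(100.0, 1.5, s_cut=50.0)
high_stress = slip_size_density(100.0, 1.5, s_cut=500.0)
print(high_stress > low_stress)
```

The stress-dependence of `s_cut` is what distinguishes the paper's "tuned critical" picture from self-organized criticality, where the cutoff would not move with stress.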
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load-monitoring. In this two-part paper, inspired by proliferation of the metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Development of a statistically based access delay timeline methodology.
Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt
2013-02-01
The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
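The core idea, propagating task-time uncertainty into a path-delay distribution rather than summing worst-case times, can be sketched with a simple Monte Carlo. The log-normal task parameters below are invented, and the paper's actual Bayesian machinery (small-sample priors, expert judgment, human factors) is not reproduced:

```python
import random

def delay_distribution(task_params, n_trials=10000, seed=1):
    """Monte Carlo sketch of a path delay timeline: each task time is
    drawn from a log-normal (right-skewed, strictly positive), and the
    path delay is the sum over sequential tasks. Returns sampled totals."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(rng.lognormvariate(mu, sigma)
                          for mu, sigma in task_params))
    return totals

# Hypothetical three-task adversary path: (mu, sigma) of log task time
totals = delay_distribution([(3.0, 0.4), (2.5, 0.3), (3.5, 0.5)])
totals.sort()
median = totals[len(totals) // 2]
p05 = totals[int(0.05 * len(totals))]  # 5th percentile: pessimistic (short) delay
print(round(median, 1), round(p05, 1))
```

The defender cares about the lower tail (the 5th percentile here): the delay the system still provides even when tasks go quickly, which replaces the single discounted worst-case number of the traditional method.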
Statistical study of reconnection exhausts in the solar wind
Enžl, J.; Přech, L.; Šafránková, J.; Němeček, Z.
2014-11-20
Magnetic reconnection is a fundamental process that changes magnetic field configuration and converts magnetic energy to flow energy and plasma heating. This paper presents a survey of the plasma and magnetic field parameters inside 418 reconnection exhausts identified in the WIND data from 1995-2012. The statistical analysis focuses on the redistribution of the magnetic energy released by reconnection between plasma acceleration and plasma heating. The results show that both the portion of the energy deposited into heat and the energy spent on the acceleration of the exhaust plasma rise with the magnetic shear angle, in accord with the increase of the magnetic flux available for reconnection. The decrease of the normalized exhaust speed with increasing magnetic shear suggests a decreasing efficiency of the acceleration and/or an increasing efficiency of heating in high-shear events. However, we have found that the previously suggested relation between the exhaust speed and temperature enhancement should rather be considered an upper limit of the plasma heating during reconnection, regardless of the shear angle.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B. (Aurora, IL); Garcia, Humberto E. (Idaho Falls, ID); Chen, Frederick W. (Naperville, IL)
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaption and stochastic model modification of the deterministic model adaptation. Deterministic model adaption involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
Glueballs and statistical mechanics of the gluon plasma
Brau, Fabian; Buisseret, Fabien
2009-06-01
We study a pure gluon plasma in the context of quasiparticle models, where the plasma is considered as an ideal gas of massive bosons. In order to reproduce SU(3) gauge field lattice data within such a framework, we review briefly the necessity to use a temperature-dependent gluon mass which accounts for color interactions between the gluons near T_c and agrees with perturbative QCD at large temperatures. Consequently, we discuss the thermodynamics of systems with temperature-dependent Hamiltonians and clarify the situation about the possible solutions proposed in the literature to treat those systems consistently. We then focus our attention on two possible formulations which are thermodynamically consistent, and we extract the gluon mass from the equation of state obtained in SU(3) lattice QCD. We find that the thermal gluon mass is similar in both statistical formalisms. Finally, an interpretation of the gluon plasma as an ideal gas made of glueballs and gluons is also presented. The glueball mass is consistently computed within a relativistic formalism using a potential obtained from lattice QCD. We find that the gluon plasma might be a glueball-rich medium for T ≲ 1.13 T_c and suggest that glueballs could be detected in future experiments dedicated to the quark-gluon plasma.
NEPA litigation 1988-1995: A detailed statistical analysis
Reinke, D.C.; Robitaille, P.
1997-08-01
The intent of this study was to identify trends and lessons learned from litigated NEPA documents and to compare and contrast these trends among Federal agencies. More than 350 NEPA cases were collected, reviewed, and analyzed. Of the NEPA cases reviewed, more than 170 were appeals or Supreme Court cases, mostly from the late 1980s through 1995. For this time period, the sampled documents represent the majority of the appeals court cases and all the Supreme Court cases. Additionally, over 170 district court cases were examined as a representative sample of district court decisions on NEPA. Cases on agency actions found to need NEPA documentation (but that had no documentation) and cases on NEPA documents that were found to be inadequate were pooled and examined to determine the factors that were responsible for these findings. The inadequate documents were specifically examined to determine if there were any general trends. The results are shown in detailed statistical terms. Generally, when a Federal agency has some type of NEPA documentation (e.g., CX, EA, or EIS) and at least covers the basic NEPA procedural requirements, the agency typically wins the litigation. NEPA documents that lose generally have serious errors of omission. An awareness and understanding of the errors of omission can help Federal agencies to ensure that they produce a winner a greater percentage of the time.
A statistical approach to designing mitigation for induced ac voltages
Dabkowski, J. [Electro Sciences, Inc., Crystal Lake, IL (United States)
1996-08-01
Induced voltage levels on buried pipelines collocated with overhead electric power transmission lines are usually mitigated by means of grounding the pipeline. Maximum effectiveness is obtained when grounds are placed at discrete locations along the pipeline where the peak induced voltages occur. The degree of mitigation achieved is dependent upon the local soil resistivity at these locations. On occasion it may be necessary to employ an extensive distributed grounding system, for example, a parallel buried wire connected to the pipe at periodic intervals. In this situation the a priori calculation of mitigated voltage levels is sometimes made assuming an average value for the soil resistivity. Over long distances, however, the soil resistivity generally varies as a log-normally distributed random variable. The effect of this variability upon the predicted mitigated voltage levels is examined. It is found that the predicted levels exhibit a statistical variability which precludes a precise determination of the mitigated voltage levels. Thus, post commissioning testing of the emplaced mitigation system is advisable.
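The effect of log-normal resistivity variability on a mitigation prediction can be seen in a toy Monte Carlo. The voltage-divider model below (residual voltage set by a grounding resistance proportional to local resistivity against a fixed source impedance) and all parameter values are illustrative assumptions, not the paper's circuit model.

```python
import math
import random

def mitigated_voltage(rho, v_induced=100.0, z_source=10.0, k=0.05):
    """Toy divider model: grounding resistance k*rho against a fixed
    source impedance sets the residual (mitigated) pipeline voltage."""
    r_ground = k * rho
    return v_induced * r_ground / (r_ground + z_source)

def simulate(n=10000, median_rho=100.0, sigma_ln=1.0, seed=1):
    """Propagate log-normal soil-resistivity variability through the model."""
    random.seed(seed)
    mu_ln = math.log(median_rho)  # lognormvariate takes the underlying normal's mu
    volts = [mitigated_voltage(random.lognormvariate(mu_ln, sigma_ln))
             for _ in range(n)]
    mean = sum(volts) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in volts) / n)
    return mean, sd

mean_v, sd_v = simulate()
naive_v = mitigated_voltage(100.0)  # prediction from the median resistivity alone
print(f"naive {naive_v:.1f} V vs Monte Carlo {mean_v:.1f} +/- {sd_v:.1f} V")
```

The spread sd_v is the statistical variability the abstract describes: it is why a single a priori number for the mitigated voltage cannot be precise, and why post-commissioning testing is advised.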
ANNUAL FEDERAL EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT OF DISCRIMINATION COMPLAINTS
National Nuclear Security Administration (NNSA)
1/18/2011 DOE NNSA Service Center 2011 (tabular complaint-count data not preserved in extraction; reporting period begins October 1st)
A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE
Hillier, A.; Morton, R. J.; Erdélyi, R.
2013-12-20
The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to study the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s{sup -1}. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.
Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes
Uhl, Jonathan T.; Pathak, Shivesh; Schorlemmer, Danijel; Liu, Xin; Swindeman, Ryan; Brinkman, Braden A. W.; LeBlanc, Michael; Tsekenis, Georgios; Friedman, Nir; Behringer, Robert; Denisov, Dmitry; Schall, Peter; Gu, Xiaojun; Wright, Wendelin J.; Hufnagel, Todd; Jennings, Andrew; Greer, Julia R.; Liaw, P. K.; Becker, Thorsten; Dresen, Georg; Dahmen, Karin A.
2015-11-17
Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip-size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied by the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean-field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. In conclusion, the results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
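The shared slip-size distribution described above can be written down directly. The power-law exponent and the inverse-square growth of the cutoff toward the critical stress below follow the usual mean-field forms, with all numerical values chosen for illustration only.

```python
import math

TAU = 1.5  # mean-field slip-size power-law exponent (illustrative value)

def slip_size_pdf(s, stress, stress_c=1.0, a=1.0):
    """Un-normalized slip-size density: a power law times an exponential
    cutoff whose scale grows as the applied stress approaches failure."""
    s_max = a / (stress_c - stress) ** 2  # tuned-critical cutoff scale
    return s ** (-TAU) * math.exp(-s / s_max)

# Small slips dominate, and large slips become likelier at higher stress.
print(slip_size_pdf(50.0, 0.5) < slip_size_pdf(50.0, 0.9))
```

The stress dependence of s_max is what distinguishes "tuned critical" behavior from self-organized criticality, where the cutoff would be stress-independent.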
Signatures of initial state modifications on bispectrum statistics
Meerburg, P Daniel; Schaar, Jan Pieter van der; Corasaniti, Pier Stefano E-mail: j.p.vanderschaar@uva.nl
2009-05-15
Modifications of the initial state of the inflaton field can induce a departure from Gaussianity and leave a testable imprint on the higher-order correlations of the CMB and large-scale structures in the Universe. We focus on the bispectrum statistics of the primordial curvature perturbation and its projection on the CMB. For a canonical single-field action the three-point correlator enhancement is localized, maximizing in the collinear limit, corresponding to enfolded or squashed triangles in comoving momentum space. We show that the available local and equilateral templates are very insensitive to this localized enhancement and do not generate noteworthy constraints on initial-state modifications. On the other hand, when considering the addition of a dimension-8 higher-order derivative term, we find a dominant rapidly oscillating contribution, which had previously been overlooked and whose significantly enhanced amplitude is independent of the triangle under consideration. Nevertheless, the oscillatory nature of (the sign of) the correlation function implies the signal is nearly orthogonal to currently available observational templates, strongly reducing the sensitivity to the enhancement. Constraints on departures from the standard Bunch-Davies vacuum state can be derived, but also depend on the next-to-leading terms. We emphasize that the construction and application of specially adapted templates could lead to CMB bispectrum constraints on modified initial states already competing with those derived from the power spectrum.
Financial statistics of major US publicly owned electric utilities 1992
Not Available
1994-01-01
The 1992 edition of the Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 4 years (1989 through 1992) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Four years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, operating revenue, and electric energy account data. The primary source of publicly owned financial data is the Form EIA-412, {open_quotes}Annual Report of Public Electric Utilities.{close_quotes} Public electric utilities file this survey on a fiscal year, rather than a calendar year, basis, in conformance with their recordkeeping practices. In previous editions of this publication, data were aggregated by the two most commonly reported fiscal years, June 30 and December 31. This omitted approximately 20 percent of the respondents, who operate on fiscal years ending in other months. Accordingly, the EIA undertook a review of the Form EIA-412 submissions to determine if alternative classifications of publicly owned electric utilities would permit the inclusion of all respondents.
Statistical behavior in deterministic quantum systems with few degrees of freedom
Jensen, R.V.; Shankar, R.
1985-04-29
Numerical studies of the dynamics of finite quantum spin chains are presented which show that quantum systems with few degrees of freedom (N = 7) can be described by equilibrium statistical mechanics. The success of the statistical description is seen to depend on the interplay between the initial state, the observable, and the Hamiltonian. This work clarifies the impact of integrability and conservation laws on statistical behavior. The relation to quantum chaos is also discussed.
Statistical Mechanics of Prion Diseases (Journal Article)
Office of Scientific and Technical Information (OSTI)
We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale
Photon-number statistics of twin beams: Self-consistent measurement, reconstruction, and properties
Peřina, Jan Jr.; Haderka, Ondřej; Michálek, Václav
2014-12-04
A method for the determination of photon-number statistics of twin beams using the joint signal-idler photocount statistics obtained by an iCCD camera is described. It also provides the absolute quantum detection efficiency of the camera. Using the measured photocount statistics, quasi-distributions of integrated intensities are obtained. They attain negative values occurring in characteristic strips as a consequence of the pairing of photons in twin beams.
On the ability of Order Statistics to distinguish different models for continuum gamma decay
Sandoval, J. J.; Cristancho, F.
2007-10-26
A simulation procedure to calculate parameters important for the application of Order Statistics to the analysis of continuum gamma decay is presented.
Investigation of statistical iterative reconstruction for dedicated breast CT
Makeev, Andrey; Glick, Stephen J.
2013-08-15
Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: the hyperbolic potential and the anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model the attenuation properties of the uncompressed breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge-preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low-noise images with high-contrast microcalcifications preserved.
In terms of numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance for various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using 2 mGy dose, than FBP-reconstructed images acquired using 6 mGy dose.Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task, in dedicated breast CT. The reported values can be used as starting values of the free parameters, when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with lower radiation dose to the patient, than using FBP with higher dose.
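One common parameterization of the hyperbolic edge-preserving potential, and the neighborhood roughness term it enters, can be sketched as follows. The weight beta and threshold delta are the free parameters the study tunes; the default values here are placeholders, not the reported optima.

```python
import math

def hyperbolic_penalty(t, delta=0.01):
    """Hyperbolic potential used as an edge-preserving roughness penalty:
    approximately quadratic for |t| << delta (smooths noise) and
    asymptotically linear for |t| >> delta (preserves edges)."""
    return delta ** 2 * (math.sqrt(1.0 + (t / delta) ** 2) - 1.0)

def roughness(image, beta=1.0, delta=0.01):
    """Penalty summed over horizontal and vertical neighbor differences
    of a 2-D image given as a list of rows."""
    rows, cols = len(image), len(image[0])
    total = 0.0
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:
                total += hyperbolic_penalty(image[i][j] - image[i][j + 1], delta)
            if i + 1 < rows:
                total += hyperbolic_penalty(image[i][j] - image[i + 1][j], delta)
    return beta * total
```

In PML, this roughness term is subtracted from the log-likelihood of the projection data; the quadratic-to-linear transition at delta is what lets the reconstruction suppress noise without blurring small high-contrast features such as microcalcifications.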
Effects of the vacuum state on the statistics of nonclassical states
Alioui, N.; Amroun-Frahi, A.; Bendjaballah, C.
2007-10-15
Based on the calculation of the Wigner function, some statistical properties of the superposition of two coherent states with a vacuum state are demonstrated. The distance variation difference function is calculated for these states. Application of homodyne statistics shows that the addition (subtraction) of the vacuum state can improve the classical channel capacity of a noiseless binary symmetric system.
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-06-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
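The derived statistics listed above all follow mechanically from the table counts. A serial sketch is shown below; the parallel aggregation across processors is the paper's contribution and is not reproduced here.

```python
import math
from collections import Counter

def contingency_stats(pairs):
    """Derive joint/marginal probabilities, pointwise mutual information,
    joint entropy, and the chi-squared independence statistic from a
    contingency table built out of (x, y) observations."""
    n = len(pairs)
    joint = Counter(pairs)                 # the contingency table itself
    mx = Counter(x for x, _ in pairs)      # row marginals
    my = Counter(y for _, y in pairs)      # column marginals
    pmi, entropy = {}, 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        px, py = mx[x] / n, my[y] / n
        pmi[(x, y)] = math.log2(pxy / (px * py))
        entropy -= pxy * math.log2(pxy)
    chi2 = 0.0
    for x in mx:
        for y in my:
            expected = mx[x] * my[y] / n
            observed = joint.get((x, y), 0)
            chi2 += (observed - expected) ** 2 / expected
    return pmi, entropy, chi2

pairs = [("a", 0), ("a", 0), ("b", 1), ("b", 1)]
pmi, H, chi2 = contingency_stats(pairs)
```

A map-reduce version would build per-processor Counters and merge them before the final pass; that merge is exactly the table-size-dependent communication cost the paper analyzes.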
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2010-03-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speedup and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields, from which predictions of communications capability may be made.
RHIC POWER SUPPLIES-FAILURE STATISTICS FOR RUNS 4, 5, AND 6
BRUNO,D.; GANETIS, G.; SANDBERG, J.; LOUIE, W.; HEPPNER, G.; SCHULTHEISS, C.
2007-06-25
The two rings in the Relativistic Heavy Ion Collider (RHIC) require a total of 933 power supplies to supply current to highly inductive superconducting magnets. Failure statistics for the RHIC power supplies will be presented for the last three RHIC runs, covering failures associated with the CEPS group's responsibilities. The failures of the power supplies will be analyzed, and the statistics associated with the power supply failures will be presented. Comparisons of the failure statistics for the last three RHIC runs will be shown. Improvements that have increased power supply availability will be discussed.
Plasma analogy and non-Abelian statistics for Ising-type quantum Hall states (Journal Article)
Office of Scientific and Technical Information (OSTI)
We study the non-Abelian statistics of quasiparticles in the Ising-type quantum Hall states, which are likely candidates to explain the observed Hall conductivity plateaus in the second Landau level, most notably the one at filling fraction {nu}=5/2.
Statistics of resonance fluorescence of a pair of atoms in a feedback loop
Tomilin, V. A.; Il'ichev, L. V.
2013-02-15
The statistics of photoemission events of a pair of closely spaced two-level atoms is calculated in a classical light field whose phase is changed by {pi} after the detection of each spontaneous photon. These statistics are compared with those obtained when the feedback is absent. In both cases, one can observe noticeable antibunching of photons in the range of parameters where no antibunching is observed in a single-atom system. The feedback substantially increases the antibunching. This effect manifests itself more strongly in relatively weak fields and for considerable frequency detunings.
A Statistical Analysis Of Bottom-Hole Temperature Data In The Hinton Area Of West-Central Alberta
OpenEI Reference Library (Journal Article)
Spin chains and Arnold's problem on the Gauss-Kuz'min statistics for quadratic irrationals
Ustinov, Alexey V
2013-05-31
New results related to a number-theoretic model of spin chains are proved. We solve Arnold's problem on the Gauss-Kuz'min statistics for quadratic irrationals. Bibliography: 24 titles.
2013 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
12. Project and Program Statistics Calculations Overview
A numerical evaluation of each project within each subprogram area, and a comparison to the other projects within the subprogram area, necessitates a statistical comparison of the projects using specific criteria. For each project, a representative set of experts in the project's field was selected to evaluate the project based upon the criteria indicated in the Introduction. Each evaluation criterion's sample mean and variance were
Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique (Conference)
Kawano, Toshihiko (Los Alamos National Laboratory)
2015-11-10
High Statistics Study of Nearby Type 1a Supernovae. QUEST Camera Short Term Maintenance: Final Technical Report (Technical Report)
Baltay, Charles
The QUEST Camera was installed at the Palomar Observatory in California. The camera was used to carry out a survey of low-redshift Type 1a supernovae.
Experimental and Statistical Comparison of Engine Response as a Function of Fuel Chemistry and Properties in CI and HCCI Engines (Conference)
Bunce, Michael; Bunting, Bruce G.; Crawford, Robert W. (ORNL)
A statistical perspective of validation and UQ (Conference)
Higdon, David M. (Los Alamos National Laboratory)
2011-10-24
Basics of Bayesian Statistics and Emulation (Conference)
Lawrence, Earl Christopher (Los Alamos National Laboratory)
2015-07-29
Super-Poissonian Statistics of Photon Emission from Single CdSe-CdS Core-Shell Nanocrystals Coupled to Metal Nanostructures (Journal Article)
Park, Young-Shin; Ghosh, Yagnaseni; Chen, Yongfen; Piryatinski, Andrei; Xu, Ping; Mack, Nathan H.
Statistics Show Bearing Problems Cause the Majority of Wind Turbine Gearbox Failures
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
September 17, 2015
In the past, the wind energy industry has been relatively conservative in terms of data sharing, especially with the general public, which has inhibited the research community's efforts to identify and mitigate the premature failures of wind turbine gearboxes.
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.; Chilton, Lawrence
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Storage Trends and Summaries Storage by Scientific Discipline Troubleshooting IO ... Storage Trends and Summaries Total Bytes Utilized The growth in NERSC's storage systems ...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Emily Casleton David Collins Michael L. Fugate James R. Gattiker, Deputy Group Leader Michael S. Hamada Geralyn M. Hemphill Aparna V. Huzurbazar Elizabeth J. Kelly Earl C....
Office of Scientific and Technical Information (OSTI)
Electronic structural and electrochemical properties of lithium zirconates and their capabilities of CO{sub 2} capture: A first-principles density-functional theory and phonon dynamics approach
Yuhua Duan
J. Renewable Sustainable Energy 3, 013102 (2011); doi: 10.1063/1.3529427
2010 Renewable Energy Data Book
Broader source: Energy.gov [DOE]
The annual report is an important assessment of U.S. energy statistics for 2010, including renewable electricity, worldwide renewable energy development, clean energy investments, and data on specific technologies. The 2010 Renewable Energy Data Book is filled with information-packed charts and graphics, which allows users, from analysts to policymakers, to quickly understand and summarize trends in renewable energy -- both on a U.S. and global scale.
2011 Renewable Energy Data Book
The annual report is an important assessment of U.S. energy statistics for 2011, including renewable electricity, worldwide renewable energy development, clean energy investments, and data on specific technologies. The 2011 Renewable Energy Data Book is filled with information-packed charts and graphics, which allows users, from analysts to policymakers, to quickly understand and summarize trends in renewable energy -- both on a U.S. and global scale.
2012 Renewable Energy Data Book
The annual report is an important assessment of U.S. energy statistics for 2012, including renewable electricity, worldwide renewable energy development, clean energy investments, and data on specific technologies. The 2012 Renewable Energy Data Book is filled with information-packed charts and graphics, which allows users, from analysts to policymakers, to quickly understand and summarize trends in renewable energy -- both on a U.S. and global scale.
2013 Renewable Energy Data Book
The annual report is an important assessment of U.S. energy statistics for 2013, including renewable electricity, worldwide renewable energy development, clean energy investments, and data on specific technologies. The 2013 Renewable Energy Data Book is filled with information-packed charts and graphics, which allows users, from analysts to policymakers, to quickly understand and summarize trends in renewable energy -- both on a U.S. and global scale.
STATISTICAL ANALYSIS OF CURRENT SHEETS IN THREE-DIMENSIONAL MAGNETOHYDRODYNAMIC TURBULENCE
Zhdankin, Vladimir; Boldyrev, Stanislav; Uzdensky, Dmitri A.; Perez, Jean C. E-mail: boldyrev@wisc.edu E-mail: jcperez@wisc.edu
2013-07-10
We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.
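The identification step the abstract describes is, at its core, a threshold-and-connected-components pass over a current-density snapshot. A simplified 2D sketch of that idea, assuming 4-connectivity and recording only each sheet's peak |j|; the array and threshold are hypothetical, and the paper's actual algorithm works on 3D reduced-MHD data with richer geometric measures:

```python
from collections import deque

def label_sheets(j, thresh):
    """Group above-threshold cells of a 2D current-density array into
    connected 'sheets' (4-connectivity) and return each sheet's peak |j|."""
    rows, cols = len(j), len(j[0])
    seen = [[False] * cols for _ in range(rows)]
    peaks = []
    for y in range(rows):
        for x in range(cols):
            if seen[y][x] or abs(j[y][x]) < thresh:
                continue
            # breadth-first flood fill over the contiguous above-threshold region
            peak, queue = 0.0, deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                peak = max(peak, abs(j[cy][cx]))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    yy, xx = cy + dy, cx + dx
                    if 0 <= yy < rows and 0 <= xx < cols \
                            and not seen[yy][xx] and abs(j[yy][xx]) >= thresh:
                        seen[yy][xx] = True
                        queue.append((yy, xx))
            peaks.append(peak)
    return peaks

snapshot = [[0.0, 2.5, 2.1, 0.0],
            [0.0, 0.0, 0.0, 0.0],
            [1.8, 0.0, 0.0, 3.0]]
print(label_sheets(snapshot, 1.0))
```

Once the cells of each sheet are known, properties such as length, width, thickness, and total dissipation rate can be accumulated per sheet in the same pass.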
Wurtz, R.; Kaplan, A.
2015-10-28
Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm and can be achieved by using conventional statistical-classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
Full counting statistics of energy fluctuations in a driven quantum resonator
Clerk, A. A.
2011-10-15
We consider the statistics of time-integrated energy fluctuations of a driven bosonic single-mode resonator, as measured by a quantum nondemolition (QND) detector, using the standard Keldysh prescription to define higher moments. We find that, due to an effective cascading of fluctuations, these statistics are surprisingly nonclassical: the low-temperature, quantum probability distribution is not equivalent to the high-temperature classical distribution evaluated at some effective temperature. Moreover, for a sufficiently large drive detuning and low temperatures, the Keldysh-ordered quasiprobability distribution characterizing these fluctuations fails to be positive-definite; this is similar to the full counting statistics of charge in superconducting systems. We argue that this indicates a kind of nonclassical behavior akin to that tested by Leggett-Garg inequalities.
A NEW METHOD TO CORRECT FOR FIBER COLLISIONS IN GALAXY TWO-POINT STATISTICS
Guo Hong; Zehavi, Idit; Zheng Zheng
2012-09-10
In fiber-fed galaxy redshift surveys, the finite size of the fiber plugs prevents two fibers from being placed too close to one another, limiting the ability to study galaxy clustering on all scales. We present a new method for correcting such fiber collision effects in galaxy clustering statistics based on spectroscopic observations. The target galaxy sample is divided into two distinct populations according to the targeting algorithm of fiber placement, one free of fiber collisions and the other consisting of collided galaxies. The clustering statistics are a combination of the contributions from these two populations. Our method makes use of observations in tile overlap regions to measure the contributions from the collided population, and to therefore recover the full clustering statistics. The method is rooted in solid theoretical ground and is tested extensively on mock galaxy catalogs. We demonstrate that our method can well recover the projected and the full three-dimensional (3D) redshift-space two-point correlation functions (2PCFs) on scales both below and above the fiber collision scale, superior to the commonly used nearest neighbor and angular correction methods. We discuss potential systematic effects in our method. The statistical correction accuracy of our method is only limited by sample variance, which scales down with (the square root of) the volume probed. For a sample similar to the final SDSS-III BOSS galaxy sample, the statistical correction error is expected to be at the level of 1% on scales ~0.1-30 h^-1 Mpc for the 2PCFs. The systematic error only occurs on small scales, caused by imperfect correction of collision multiplets, and its magnitude is expected to be smaller than 5%. Our correction method, which can be generalized to other clustering statistics as well, enables more accurate measurements of full 3D galaxy clustering on all scales with galaxy redshift surveys.
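For readers unfamiliar with the two-point statistics being corrected: the 2PCF compares pair counts in the data against pair counts in an unclustered random catalog. This is a toy 1D sketch of the standard "natural" estimator, not the paper's tile-overlap correction; the bin edges and positions are invented for illustration:

```python
import itertools

def pair_counts(points, edges):
    """Histogram pairwise separations of 1D positions into bins [edges[i], edges[i+1])."""
    counts = [0] * (len(edges) - 1)
    for a, b in itertools.combinations(points, 2):
        r = abs(a - b)
        for i in range(len(counts)):
            if edges[i] <= r < edges[i + 1]:
                counts[i] += 1
                break
    return counts

def xi_natural(data, rand_pts, edges):
    """Natural 2PCF estimator: xi = (DD/RR) * Nr(Nr-1)/(Nd(Nd-1)) - 1."""
    dd = pair_counts(data, edges)
    rr = pair_counts(rand_pts, edges)
    nd, nr = len(data), len(rand_pts)
    norm = nr * (nr - 1) / (nd * (nd - 1))
    return [dd[i] / rr[i] * norm - 1 if rr[i] else float("nan")
            for i in range(len(dd))]

# With data distributed identically to the randoms, xi vanishes in every bin.
edges = [0.5, 1.5, 2.5, 3.5, 4.5]
print(xi_natural([0, 1, 2, 4], [0, 1, 2, 4], edges))
```

Fiber collisions bias DD in the smallest separation bins, which is why correction methods like the one above matter most below the collision scale.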
Li, Dongsheng; Khaleel, Mohammad A.; Sun, Xin; Garmestani, Hamid
2010-03-01
Statistical correlation functions, including the two-point function, are among the popular methods for digitizing microstructure quantitatively. This paper investigated how to represent statistical correlations using layered fast spherical harmonics expansion. A set of spherical harmonics coefficients may be used to represent the corresponding microstructures. The method is applied to carbon nanotube composite microstructures to demonstrate how efficiently and precisely the harmonics coefficients characterize the microstructure. This microstructure representation methodology will dramatically improve computational efficiency in future work on microstructure reconstruction and property prediction.
Steffen, Jason H.; Ford, Eric B.; Rowe, Jason F.; Fabrycky, Daniel C.; Holman, Matthew J.; Welsh, William F.; Borucki, William J.; Batalha, Natalie M.; Bryson, Steve; Caldwell, Douglas A.; Ciardi, David R.; /Caltech /NASA, Ames /SETI Inst., Mtn. View
2012-01-01
We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through Quarter six (Q6) of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.
Masked Areas in Shear Peak Statistics: A Forward Modeling Approach (Journal Article) | SciTech Connect
Office of Scientific and Technical Information (OSTI)
Authors: Bard, D. (KIPAC, Menlo Park); Kratochvil, J.M. (KwaZulu-Natal U.); Dawson, W. (LLNL, Livermore). Publication Date: 2016-02-18. OSTI Identifier: 1238567. Report Number(s): SLAC-PUB-16483, arXiv:1410.5446. DOE Contract Number: AC02-76SF00515. Resource Type: Journal Article
Statistical techniques for characterizing residual waste in single-shell and double-shell tanks
Jensen, L., Fluor Daniel Hanford
1997-02-13
A primary objective of the Hanford Tank Initiative (HTI) project is to develop methods to estimate the inventory of residual waste in single-shell and double-shell tanks. A second objective is to develop methods to determine the boundaries of waste that may be in the waste plume in the vadose zone. This document presents statistical sampling plans that can be used to estimate the inventory of analytes within the residual waste within a tank. Sampling plans for estimating the inventory of analytes within the waste plume in the vadose zone are also presented. Inventory estimates can be used to classify the residual waste with respect to chemical and radiological hazards. Based on these estimates, it will be possible to make decisions regarding the final disposition of the residual waste. Four sampling plans for the residual waste in a tank are presented. The first plan assumes that, on the basis of some physical characteristic, the residual waste can be divided into disjoint strata, with waste samples obtained from randomly selected locations within each stratum. The second plan obtains waste samples from randomly selected locations throughout the waste. The third and fourth plans are similar to the first two, except that composite samples are formed from multiple samples. Common to the four plans is that, in the laboratory, replicate analytical measurements are obtained from homogenized waste samples. The statistical sampling plans for the residual waste are similar to those developed for the tank waste characterization program. In that program, the statistical sampling plans required multiple core samples of waste and replicate analytical measurements from homogenized core segments.
A statistical analysis of the analytical data, obtained from use of the statistical sampling plans developed for the characterization program or from the HTI project, provides estimates of mean analyte concentrations and confidence intervals on the mean. In addition, the statistical analysis provides estimates of spatial and measurement variabilities. The magnitudes of these sources of variability are used to determine how well the inventory of the analytes in the waste has been estimated. This document provides statistical sampling plans that can be used to estimate the inventory of the analytes in the residual waste in single-shell and double-shell tanks and in the waste plume in the vadose zone.
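In the simplest case (the second plan, simple random sampling), the mean concentration and its confidence interval reduce to the familiar t-based formula, mean ± t * s / sqrt(n). A minimal sketch; the concentrations, units, and confidence level below are hypothetical, not Hanford data:

```python
import statistics

def mean_ci(samples, t_crit):
    """Point estimate and confidence interval for a mean analyte concentration
    from replicate measurements under simple random sampling.
    t_crit is the two-sided t critical value for n-1 degrees of freedom."""
    n = len(samples)
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / n ** 0.5   # standard error of the mean
    return m, (m - t_crit * se, m + t_crit * se)

# Hypothetical analyte concentrations (ug/g) from five homogenized samples;
# 2.776 is the two-sided 95% t critical value for 4 degrees of freedom.
conc = [12.1, 11.8, 12.6, 12.0, 12.5]
m, (low, high) = mean_ci(conc, 2.776)
print(round(m, 2), round(low, 2), round(high, 2))
```

The stratified and composite plans change only how the variance components (spatial versus measurement) enter the standard error; the interval construction is the same.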
Fact #602: December 21, 2009 Freight Statistics by Mode, 2007 Commodity Flow Survey | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Results from the 2007 Commodity Flow Survey (CFS) show that about 70% of all freight movement in the U.S. is by truck, in terms of shipment value and tonnage. Rail moves about 15% of freight tons but moves those tons over great distances, accounting for 37% of ton-miles. Parcel delivery, US...
Classification of generalized quantum statistics associated with the exceptional Lie (super)algebras
Stoilova, N. I.; Jeugt, J. van der
2007-04-15
Generalized quantum statistics (GQS) associated with a Lie algebra or Lie superalgebra extends the notion of para-Bose or para-Fermi statistics. Such GQS have been classified for all classical simple Lie algebras and basic classical Lie superalgebras. In the current paper we finalize this classification for all exceptional Lie algebras and superalgebras. Since the definition of GQS is closely related to a certain Z grading of the Lie (super)algebra G, our classification reproduces some known Z gradings of exceptional Lie algebras. For exceptional Lie superalgebras such a classification of Z gradings has not been given before.
Statistical Properties of Inter-Series Mixing in Helium: From Integrability to Chaos
Püttner, R.; Gremaud, B.; Delande, D.; Domke, M.; Martins, M.; Schlachter, A. S.; Kaindl, G.
2001-04-23
The photoionization spectrum of helium shows considerable complexity close to the double-ionization threshold. By analyzing the results from both our recent experiments and ab initio three- and one-dimensional calculations, we show that the statistical properties of the spacings between neighboring energy levels clearly display a transition towards quantum chaos.
Statistical theory of Coulomb blockade oscillations: Quantum chaos in quantum dots
Jalabert, R.A.; Stone, A.D.; Alhassid, Y. (Center for Theoretical Physics, Sloane Physics Laboratory, Yale University, New Haven, Connecticut 06511 (United States))
1992-06-08
We develop a statistical theory of the amplitude of Coulomb blockade oscillations in semiconductor quantum dots based on the hypothesis that chaotic dynamics in the dot potential leads to behavior described by random-matrix theory. Breaking time-reversal symmetry is predicted to cause an experimentally observable change in the distribution of amplitudes. The theory is tested numerically and good agreement is found.
Short-Term Arctic Cloud Statistics at NSA from the Infrared Cloud...
Figure 4. Monthly cloud statistics, binned in 10% cloud-amount intervals from 10-20% to 90-100% (March data limited to the last two weeks). Acknowledgment: The ICI system was...
Impact of high-order moments on the statistical modeling of transition arrays
Gilleron, Franck; Pain, Jean-Christophe; Bauche, Jacques; Bauche-Arnoult, Claire
2008-02-15
The impact of high-order moments on the statistical modeling of transition arrays in complex spectra is studied. It is shown that a departure from the Gaussian, which is usually employed in such an approach, may be observed even in the shape of unresolved spectra due to the large value of the kurtosis coefficient. The use of a Gaussian shape may also overestimate the width of the spectra in some cases. Therefore, it is proposed to simulate the statistical shape of the transition arrays by the more flexible generalized Gaussian distribution which introduces an additional parameter--the power of the argument in the exponential--that can be constrained by the kurtosis value. The relevance of the statistical line distribution is checked by comparisons with smoothed spectra obtained from detailed line-by-line calculations. The departure from the Gaussian is also confirmed through the analysis of 2p-3d transitions of recent absorption measurements. A numerical fit is proposed for an easy implementation of the statistical profile in atomic-structure codes.
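The proposal above, constraining the generalized Gaussian's exponent by the kurtosis, can be made concrete. For f(x) proportional to exp(-(|x|/alpha)^beta), the kurtosis is Gamma(5/beta)Gamma(1/beta)/Gamma(3/beta)^2, which decreases monotonically in beta (beta=1 Laplace gives 6, beta=2 Gaussian gives 3), so the measured kurtosis fixes beta by a one-dimensional root find. A sketch, with bisection bounds chosen for illustration:

```python
import math

def gg_kurtosis(beta):
    """Kurtosis of the generalized Gaussian f(x) ~ exp(-(|x|/alpha)**beta)."""
    g = math.gamma
    return g(5.0 / beta) * g(1.0 / beta) / g(3.0 / beta) ** 2

def beta_from_kurtosis(k, lo=0.2, hi=20.0, iters=100):
    """Invert the kurtosis relation by bisection; valid because the
    kurtosis is monotonically decreasing in beta on [lo, hi]."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if gg_kurtosis(mid) > k:   # kurtosis too large -> need larger beta
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(beta_from_kurtosis(3.0), 3))   # Gaussian limit recovers beta = 2
```

A transition array with kurtosis above 3 thus gets a beta below 2 (heavier tails than a Gaussian), which narrows the apparent width relative to the Gaussian fit, consistent with the overestimation noted in the abstract.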
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.
2010-03-01
Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
Widen, Joakim; Waeckelgaard, Ewa; Paatero, Jukka; Lund, Peter
2010-03-15
The trend of increasing application of distributed generation with solar photovoltaics (PV-DG) suggests that a widespread integration in existing low-voltage (LV) grids is possible in the future. With massive integration in LV grids, a major concern is the possible negative impacts of excess power injection from on-site generation. For power-flow simulations of such grid impacts, an important consideration is the time resolution of demand and generation data. This paper investigates the impact of time averaging on high-resolution data series of domestic electricity demand and PV-DG output and on voltages in a simulated LV grid. Effects of 10-minutely and hourly averaging on descriptive statistics and duration curves were determined. Although time averaging has a considerable impact on statistical properties of the demand in individual households, the impact is smaller on aggregate demand, already smoothed from random coincidence, and on PV-DG output. Consequently, the statistical distribution of simulated grid voltages was also robust against time averaging. The overall judgement is that statistical investigation of voltage variations in the presence of PV-DG does not require higher resolution than hourly. (author)
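The time-averaging operation studied above is just a block mean: each 10-minute or hourly value is the mean of the underlying high-resolution samples, which necessarily shrinks the variance of the series. A toy sketch with invented "demand" values; real household data would of course be far longer and noisier:

```python
import statistics

def block_average(series, k):
    """Average consecutive k-sample blocks (e.g. minute data -> 10-min or hourly)."""
    return [sum(series[i:i + k]) / k for i in range(0, len(series) - k + 1, k)]

# Toy high-resolution 'demand' with fast fluctuations (hypothetical values).
minute = [1.0, 9.0, 2.0, 8.0, 1.5, 8.5, 2.5, 7.5, 1.0, 9.0, 2.0, 8.0]
hourly = block_average(minute, 6)
print(statistics.pstdev(minute), statistics.pstdev(hourly))
```

The averaged series has much smaller spread than the original, which is the mechanism behind the paper's finding: individual-household statistics are sensitive to resolution, while already-smooth aggregate demand and the resulting grid voltages are not.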
The statistical properties of Klein-Gordon oscillator in noncommutative space
Hassanabadi, H.; Hosseini, S. S.; Boumali, A.; Zarrinkamar, S.
2014-03-15
We study the relativistic spin-zero bosons influenced by the Klein-Gordon oscillator and an external magnetic field in noncommutative formulation. The problem is considered in two dimensions and is solved in an exact analytical manner. Having found the spectrum of the system, the statistical properties of an N-boson system are reported.
Financial statistics of selected publicly owned electric utilities 1989. [Contains glossary]
Not Available
1991-02-06
The Financial Statistics of Selected Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with data that can be used for policymaking and decision making purposes relating to publicly owned electric utility issues. 21 tabs.
Figure 4. Production Schedules at Two Development Rates for the Statistical Mean of Recovering 10.3 Billion Barrels of Technically Recoverable Oil from the ANWR Coastal Plain of Alaska
U.S. Energy Information Administration (EIA) Indexed Site
Lauzier, Pascal Theriault; Chen Guanghong
2013-02-15
Purpose: The ionizing radiation imparted to patients during computed tomography exams is raising concerns. This paper studies the performance of a scheme called dose reduction using prior image constrained compressed sensing (DR-PICCS). The purpose of this study is to characterize the effects of a statistical model of x-ray detection in the DR-PICCS framework and its impact on spatial resolution. Methods: Both numerical simulations with known ground truth and in vivo animal dataset were used in this study. In numerical simulations, a phantom was simulated with Poisson noise and with varying levels of eccentricity. Both the conventional filtered backprojection (FBP) and the PICCS algorithms were used to reconstruct images. In PICCS reconstructions, the prior image was generated using two different denoising methods: a simple Gaussian blur and a more advanced diffusion filter. Due to the lack of shift-invariance in nonlinear image reconstruction such as the one studied in this paper, the concept of local spatial resolution was used to study the sharpness of a reconstructed image. Specifically, a directional metric of image sharpness, the so-called pseudopoint spread function (pseudo-PSF), was employed to investigate local spatial resolution. Results: In the numerical studies, the pseudo-PSF was reduced from twice the voxel width in the prior image down to less than 1.1 times the voxel width in DR-PICCS reconstructions when the statistical model was not included. At the same noise level, when statistical weighting was used, the pseudo-PSF width in DR-PICCS reconstructed images varied between 1.5 and 0.75 times the voxel width depending on the direction along which it was measured. However, this anisotropy was largely eliminated when the prior image was generated using diffusion filtering; the pseudo-PSF width was reduced to below one voxel width in that case. 
In the in vivo study, a fourfold improvement in CNR was achieved while qualitatively maintaining sharpness; images also had a qualitatively more uniform noise spatial distribution when including a statistical model. Conclusions: DR-PICCS enables reconstruction of CT images with lower noise than FBP, and the loss of spatial resolution can be mitigated to a large extent. The introduction of statistical modeling in DR-PICCS may improve some noise characteristics, but it also leads to anisotropic spatial resolution properties. A denoising method, such as the directional diffusion filtering, has been demonstrated to reduce anisotropy in spatial resolution effectively when it was combined with DR-PICCS with statistical modeling.
Connes distance function on fuzzy sphere and the connection between geometry and statistics
Devi, Yendrembam Chaoba; Chakraborty, Biswajit; Prajapat, Shivraj; Mukhopadhyay, Aritra K.; Scholtz, Frederik G.
2015-04-15
An algorithm to compute Connes spectral distance, adaptable to the Hilbert-Schmidt operatorial formulation of non-commutative quantum mechanics, was developed earlier by introducing the appropriate spectral triple and used to compute infinitesimal distances in the Moyal plane, revealing a deep connection between geometry and statistics. In this paper, using the same algorithm, the Connes spectral distance has been calculated in the Hilbert-Schmidt operatorial formulation for the fuzzy sphere whose spatial coordinates satisfy the su(2) algebra. This has been computed for both the discrete and the Perelomov SU(2) coherent states. Here also, we get a connection between geometry and statistics, which is shown by computing the infinitesimal distance between mixed states on the quantum Hilbert space of a particular fuzzy sphere, indexed by n ∈ ℤ/2.
On exact statistics and classification of ergodic systems of integer dimension
Guralnik, Zachary; Guralnik, Gerald; Pehlevan, Cengiz
2014-06-01
We describe classes of ergodic dynamical systems for which some statistical properties are known exactly. These systems have integer dimension, are not globally dissipative, and are defined by a probability density and a two-form. This definition generalizes the construction of Hamiltonian systems by a Hamiltonian and a symplectic form. Some low dimensional examples are given, as well as a discretized field theory with a large number of degrees of freedom and a local nearest neighbor interaction. We also evaluate unequal-time correlations of these systems without direct numerical simulation, by Padé approximants of a short-time expansion. We briefly speculate on the possibility of constructing chaotic dynamical systems with non-integer dimension and exactly known statistics. In this case there is no probability density, suggesting an alternative construction in terms of a Hopf characteristic function and a two-form.
Financial statistics of major US investor-owned electric utilities 1994
1995-12-01
The Financial Statistics of Major U.S. Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide Federal and State Governments, industry, and the general public with current and historical data that can be used for making policy and decisions relating to investor-owned electric utility issues.
Net-shape fabrication of Y-TZP ceramic through a statistically designed experiment
Ghosh, S.K.; Chatterjee, D.K.; Koziol, D.R.; Majumdar, D.
1993-11-01
Net-shape yttria-doped tetragonal zirconia polycrystal (Y-TZP) ceramic articles of various shapes and dimensions were fabricated using 15 000 psi uniaxial pressure and 1500 C sintering temperature. A statistically designed experiment was conducted to determine the parameters for uniaxial pressure as well as sintering temperature so that shrinkage and dimensions of the sintered parts could be controlled. Shrinkage rate, density, and dimensional tolerance of the articles were greatly influenced by the compacting pressure and the sintering temperature.
Chou, Charissa J; Johnson, Vernon G
2000-03-08
This report updates the original effluent variability study for the 200 Area Treated Effluent Disposal Facility (TEDF) and provides supporting justification for modifying the effluent monitoring portion of the discharge permit. Four years of monitoring data were evaluated and used to statistically justify changes in permit effluent monitoring conditions. As a result, the TEDF effluent composition and variability of the effluent waste stream are now well defined.
Harlim, John; Mahdi, Adam; Majda, Andrew J.
2014-01-15
A central issue in contemporary science is the development of nonlinear data driven statistical–dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad-hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics constrained nonlinear regression models were developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, the model and the observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east–west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skew non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model.
Infrared Cloud Imager Measurements of Cloud Statistics from the 2003 Cloudiness Intercomparison Campaign B. Thurairajah and J. A. Shaw Department of Electrical and Computer Engineering Montana State University Bozeman, Montana Introduction The Cloudiness Inter-Comparison Intensive Operational Period (CIC IOP) occurred at the Atmospheric Radiation Measurement (ARM), Southern Great Plains (SGP) central facility site in Lamont, Oklahoma from mid-February to mid-April 2003 (Kassianov et al. 2004).
Financial statistics of major U.S. investor-owned electric utilities 1993
Not Available
1995-01-01
The Financial Statistics of Major US Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.
Francois, D.K.
1994-12-31
This document contains statistical data on the following: federal offshore lands; offshore leasing activity and status; offshore development activity; offshore production of crude oil and natural gas; federal offshore oil and natural gas sales volume and royalties; revenue from federal offshore leases; disbursement of federal offshore revenue; reserves and resource estimates of offshore oil and natural gas; oil pollution in US and international waters; and international activities and marine minerals. A glossary is included.
Statistical Analysis of Transient Cycle Test Results in a 40 CFR Part 1065 Engine Dynamometer Test Cell | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Effects of "new" engine testing procedures (40 CFR Part 1065) with respect to repeatability of transient engine dynamometer tests were examined, as well as the effects of calibration and measurement methods.
Statistical assessment of fish behavior from split-beam hydro-acoustic sampling
McKinstry, Craig A.; Simmons, Mary Ann; Simmons, Carver S.; Johnson, Robert L.
2005-04-01
Statistical methods are presented for using echo-traces from split-beam hydro-acoustic sampling to assess fish behavior in response to a stimulus. The data presented are from a study designed to assess the response of free-ranging, lake-resident fish, primarily kokanee (Oncorhynchus nerka) and rainbow trout (Oncorhynchus mykiss) to high intensity strobe lights, and was conducted at Grand Coulee Dam on the Columbia River in Northern Washington State. The lights were deployed immediately upstream from the turbine intakes, in a region exposed to daily alternating periods of high and low flows. The study design included five down-looking split-beam transducers positioned in a line at incremental distances upstream from the strobe lights, and treatments applied in randomized pseudo-replicate blocks. Statistical methods included the use of odds-ratios from fitted loglinear models. Fish-track velocity vectors were modeled using circular probability distributions. Both analyses are depicted graphically. Study results suggest large increases of fish activity in the presence of the strobe lights, most notably at night and during periods of low flow. The lights also induced notable bimodality in the angular distributions of the fish track velocity vectors. Statistical summaries are presented along with interpretations on fish behavior.
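An odds ratio of the kind derived from the fitted loglinear models can be illustrated with a 2x2 table of echo-trace counts. This is a simplified stand-in, not the study's actual model or data; the counts below are hypothetical:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 contingency table laid out as:
           lights on | lights off
    high:      a     |     b
    low:       c     |     d
    (a simplified stand-in for the fitted loglinear models in the study)."""
    return (a * d) / (b * c)

# Hypothetical fish-track counts: activity level vs. strobe-light treatment.
print(odds_ratio(120, 30, 40, 60))
```

An odds ratio well above 1 (here 6) would indicate that high activity is far more likely with the lights on, the direction of effect the study reports for night and low-flow periods; the circular-distribution modeling of velocity vectors is a separate analysis not sketched here.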
An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-11-01
In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operations of the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
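The statistics-engine operation needing communication, the merging of partial results, can be sketched generically with the standard pairwise count/mean/M2 update (an illustration of the idea, not the Legion implementation):

```python
def partial_stats(xs):
    # Local (count, mean, M2) for one task's block of data.
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs)
    return n, mean, m2

def combine(a, b):
    # Merge two partial results without revisiting the raw data; in an AMT
    # setting this is the kind of reduction a logical region can carry.
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2
```

Because `combine` is associative, the partial results can be merged in any task order, which is what makes the scheme communication-pattern-agnostic.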
A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties
Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; Spence, H. E.; Lin, R. -L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.
2015-07-23
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
Lombardi, M.; Seligman, T.H. (Laboratoire de Spectrometrie Physique-Universite Joseph Fourier de Grenoble, Boite Postale No. 87, 38402 Saint Martin d'Heres, CEDEX (France))
1993-05-01
We study Rydberg molecules taking into account the interaction between the rotational motion of the nuclei and the radial motion of the electron. This situation can be treated to a good approximation in quantum mechanics by the multichannel quantum-defect method which in turn has a well-defined classical limit. We are able to calculate very long sequences of levels and the corresponding amplitudes of wave packets. This allows us to study the statistical properties of both in detail. Our interest focuses on aspects of "quantum chaos" that can be particularly well understood in this case. Our main result is that, in a completely chaotic classical situation, where statistics of quantum-level spacings follow the expected universal Gaussian-orthogonal-ensemble behavior, and statistics of line intensities display the expected universal Porter-Thomas behavior, nonuniversal properties are explicitly contained in correlations between intensities and spacings, determined by the time needed for the classical system to mix on a length scale given by the quantum wavelength.
Kareem, A.; Zhao, J.
1994-12-31
The nonlinearities in the wind and wave loadings of compliant offshore platforms and in their structural characteristics result in response statistics that deviate from a Gaussian distribution. This paper focuses on the analysis of the response of these structures to random nonlinear wind and wave loads. As an improvement over the commonly used linearization approach an equivalent statistical quadratization (ESQ) and cubicization (ESC) approach is presented. The nonlinear loading or structural characteristics can be expressed in terms of an equivalent polynomial that contains terms up to quadratic or cubic depending on the type of nonlinearity. The response statistics and its cumulants are based on Volterra theory. A direct integration scheme is utilized to evaluate the response cumulants. The results provide a good comparison with simulation. It is noted that the ESQ provides an accurate description of the systems with asymmetrical nonlinearities, whereas, for symmetrical nonlinearities the ESC provides a good representation. Based on the information on higher-order cumulants, the response pdf, crossing rates and peak value distributions can be derived.
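A toy illustration of the cubicization idea (not the paper's Volterra/cumulant machinery): a symmetric, odd drag-type nonlinearity u|u| driven by Gaussian input is well captured by an equivalent cubic polynomial, with the even coefficients vanishing:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(100_000)   # Gaussian input samples
y = u * np.abs(u)                  # Morison-type quadratic drag nonlinearity
# Equivalent cubicization: least-squares fit y ~ c0 + c1*u + c2*u^2 + c3*u^3.
c = np.polynomial.polynomial.polyfit(u, y, 3)
# For this odd nonlinearity, c0 and c2 vanish in expectation, while c1 and c3
# stay positive (analytically c1 = sqrt(2/pi) ~ 0.80 and c3 = c1/3 ~ 0.27).
```

For an asymmetric nonlinearity the same fit with degree 2 (quadratization) would be the appropriate choice, mirroring the ESQ/ESC distinction in the abstract.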
Statistical analysis of multipole components in the magnetic field of the RHIC arc regions
Beebe-Wang,J.; Jain, A.
2009-05-04
The existence of multipolar components in the dipole and quadrupole magnets is one of the factors limiting the beam stability in the RHIC operations. Therefore, the statistical properties of the non-linear fields are crucial for understanding the beam behavior and for achieving superior performance in RHIC. In an earlier work [1], the field quality analysis of the RHIC interaction regions (IR) was presented. Furthermore, a procedure for developing non-linear IR models constructed from measured multipolar data of RHIC IR magnets was described. However, the field quality in the regions outside of the RHIC IR had not yet been addressed. In this paper, we present the statistical analysis of multipolar components in the magnetic fields of the RHIC arc regions. The emphasis is on the lower order components, especially the sextupole in the arc dipole and the 12-pole in the quadrupole magnets, since they are shown to have the strongest effects on the beam stability. Finally, the inclusion of the measured multipolar components data of RHIC arc regions and their statistical properties into tracking models is discussed.
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Wehner, Michael F.; Bala, G.; Duffy, Phillip; Mirin, Arthur A.; Romano, Raquel
2010-01-01
We present a set of high-resolution global atmospheric general circulation model (AGCM) simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST) is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments as well as a focus on the Northern hemisphere tropical storm seasons.
Sanov and central limit theorems for output statistics of quantum Markov chains
Horssen, Merlijn van; Guță, Mădălin
2015-02-15
In this paper, we consider the statistics of repeated measurements on the output of a quantum Markov chain. We establish a large deviations result analogous to Sanov’s theorem for the multi-site empirical measure associated to finite sequences of consecutive outcomes of a classical stochastic process. Our result relies on the construction of an extended quantum transition operator (which keeps track of previous outcomes) in terms of which we compute moment generating functions, and whose spectral radius is related to the large deviations rate function. As a corollary to this, we obtain a central limit theorem for the empirical measure. Such higher level statistics may be used to uncover critical behaviour such as dynamical phase transitions, which are not captured by lower level statistics such as the sample mean. As a step in this direction, we give an example of a finite system whose level-1 (empirical mean) rate function is independent of a model parameter while the level-2 (empirical measure) rate is not.
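The level-2 (empirical measure) statistics can be sketched for a classical stand-in: a two-state Markov chain plays the role of the measurement record below (the quantum transition-operator construction is not reproduced here, and the transition probabilities are invented for illustration):

```python
import random

random.seed(5)
# Two-state chain: probability of moving to state 0 given the current state.
# Its stationary law is (2/3, 1/3), since pi0 * 0.1 = pi1 * 0.2.
p_to_zero = {0: 0.9, 1: 0.2}

state, steps = 0, 100_000
counts = [0, 0]
for _ in range(steps):
    counts[state] += 1
    state = 0 if random.random() < p_to_zero[state] else 1

empirical = [c / steps for c in counts]   # single-site empirical measure
```

By the ergodic theorem the empirical measure concentrates on the stationary law; Sanov-type large deviations, as in the paper, govern the exponential decay of its fluctuations.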
HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Pettit, Frank K.; Bowie, James U. [DOE-Molecular Biology Institute]
HotPatch finds unusual patches on the surface of proteins, and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temperature factors) are replaced by patch indices, with statistical scores given in the file's header remarks; and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]
Mask effects on cosmological studies with weak-lensing peak statistics
Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui; Wang, Qiao
2014-03-20
With numerical simulations, we analyze in detail how the bad data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ~11% of the total number of peaks, compared with ~7% in the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg{sup 2}, the bias in (Ω{sub m}, σ{sub 8}) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
Johannesson, G
2010-03-17
Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO{sub 2} emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. It is answering these kinds of questions that is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) Uncertainty about future emission of greenhouse gases (GHG). (2) Given a future GHG emission scenario, what is its impact on the global climate? (3) Given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of the state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in simulating future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS).
Dynamic downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we will explore how to leverage an ensemble of GCM projections to quantify the impact of the GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
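A minimal regression-based SDS sketch (all numbers are synthetic stand-ins; the report's actual Python/R tooling is not reproduced here): fit local observations on a synoptic predictor, then push an ensemble of GCM projections through the fit so the GCM spread propagates to the local scale.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "training" data: synoptic (GCM-scale) predictor vs. observed
# local temperature, with a known linear relationship plus noise.
x_obs = rng.normal(15.0, 2.0, 200)
y_obs = 0.8 * x_obs + 3.0 + rng.normal(0.0, 0.5, 200)

# Ordinary least squares: y = b0 + b1 * x.
A = np.column_stack([np.ones_like(x_obs), x_obs])
beta, *_ = np.linalg.lstsq(A, y_obs, rcond=None)

# Apply to a hypothetical ensemble of future GCM projections; the ensemble
# spread carries the GCM uncertainty into the downscaled local projection.
gcm_ensemble = np.array([18.0, 19.5, 17.2, 20.1])
local_proj = beta[0] + beta[1] * gcm_ensemble
```

The spread of `local_proj` across ensemble members is the simplest quantification of how GCM uncertainty flows through the downscaling relationship.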
Not Available
1990-05-01
This report is a collection of papers documenting presentations made at the VIII ASA (American Statistical Association) Conference on Radiation and Health entitled Health Effects of Electric and Magnetic Fields: Statistical Support for Research Strategies. Individual papers are abstracted and indexed for the database.
Renewable Energy Data Book Details Growing Industry in 2012 | Department of
Energy Renewable Energy Data Book Details Growing Industry in 2012 Renewable Energy Data Book Details Growing Industry in 2012 December 4, 2013 - 12:00am Addthis The National Renewable Energy Laboratory (NREL) on November 21 released the 2012 Renewable Energy Data Book on behalf of the Energy Department's Office of Energy Efficiency and Renewable Energy. The annual report is an important assessment of U.S. energy statistics for 2012, including renewable electricity, worldwide renewable
McManamay, Ryan A
2014-01-01
Despite the ubiquitous existence of dams within riverscapes, much of our knowledge about dams and their environmental effects remains context-specific. Hydrology, more than any other environmental variable, has been studied in great detail with regard to dam regulation. While much progress has been made in generalizing the hydrologic effects of regulation by large dams, many aspects of hydrology show site-specific fidelity to dam operations, small dams (including diversions), and regional hydrologic regimes. A statistical modeling framework is presented to quantify and generalize hydrologic responses to varying degrees of dam regulation. Specifically, the objectives were to 1) compare the effects of local versus cumulative dam regulation, 2) determine the importance of different regional hydrologic regimes in influencing hydrologic responses to dams, and 3) evaluate how different regulation contexts lead to error in predicting hydrologic responses to dams. Overall, model performance was poor in quantifying the magnitude of hydrologic responses, but performance was sufficient in classifying hydrologic responses as negative or positive. Responses of some hydrologic indices to dam regulation were highly dependent upon hydrologic class membership and the purpose of the dam. The opposing coefficients between local and cumulative-dam predictors suggested that hydrologic responses to cumulative dam regulation are complex, and predicting the hydrology downstream of individual dams, as opposed to multiple dams, may be more easily accomplished using statistical approaches. Results also suggested that particular contexts, including multipurpose dams, high cumulative regulation by multiple dams, diversions, close proximity to dams, and certain hydrologic classes are all sources of increased error when predicting hydrologic responses to dams.
Statistical models, such as the ones presented herein, show promise in their ability to model the effects of dam regulation at large spatial scales and to generalize the directionality of hydrologic responses.
Statistical recoupling: A new way to break the link between electric-utility sales and revenues
Hirst, E.
1993-09-01
In 1991, US electric utilities spent almost $1.8 billion on demand-side management (DSM) programs. These programs cut peak demands 5% and reduced electricity sales 1% that year. Utility projections suggest that these reductions will increase to 9% and 3%, respectively, by the year 2001. However, utility DSM efforts vary enormously across the country, concentrated in a few states along the east and west coasts and the upper midwest. To some extent, this concentration is a function of regulatory reforms that remove disincentives to utility shareholders for investments in DSM programs. A key component of these reforms is recovery of the net lost revenues caused by utility DSM programs. These lost revenues occur between rate cases when a utility encourages its customers to improve energy efficiency and cut demand. The reduction in sales means that the utility has less revenue to cover its fixed costs. This report describes a new method, statistical recoupling (SR), that addresses this net-lost-revenue problem. Like other decoupling approaches, SR breaks the link between electric-utility revenues and sales. Unlike other approaches, SR minimizes changes from traditional regulation. In particular, the risks of revenue swings associated with year-to-year changes in weather and the economy remain with the utility under SR. Statistical recoupling uses statistical models, based on historical data, that explain retail electricity sales as functions of the number of utility customers, winter and summer weather, the condition of the local economy, electricity price, and perhaps a few other key variables. These models, along with the actual values of the explanatory variables, are then used to estimate "allowed" electricity sales and revenues in future years.
Akrami, Yashar; Savage, Christopher; Scott, Pat; Conrad, Jan; Edsjö, Joakim E-mail: savage@fysik.su.se E-mail: conrad@fysik.su.se
2011-07-01
Models of weak-scale supersymmetry offer viable dark matter (DM) candidates. Their parameter spaces are however rather large and complex, such that pinning down the actual parameter values from experimental data can depend strongly on the employed statistical framework and scanning algorithm. In frequentist parameter estimation, a central requirement for properly constructed confidence intervals is that they cover true parameter values, preferably at exactly the stated confidence level when experiments are repeated infinitely many times. Since most widely-used scanning techniques are optimised for Bayesian statistics, one needs to assess their abilities in providing correct confidence intervals in terms of the statistical coverage. Here we investigate this for the Constrained Minimal Supersymmetric Standard Model (CMSSM) when only constrained by data from direct searches for dark matter. We construct confidence intervals from one-dimensional profile likelihoods and study the coverage by generating several pseudo-experiments for a few benchmark sets of pseudo-true parameters. We use nested sampling to scan the parameter space and evaluate the coverage for the benchmarks when either flat or logarithmic priors are imposed on gaugino and scalar mass parameters. The sampling algorithm has been used in the configuration usually adopted for exploration of the Bayesian posterior. We observe both under- and over-coverage, which in some cases vary quite dramatically when benchmarks or priors are modified. We show how most of the variation can be explained as the impact of explicit priors as well as sampling effects, where the latter are indirectly imposed by physicality conditions. For comparison, we also evaluate the coverage for Bayesian credible intervals, and observe significant under-coverage in those cases.
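The coverage procedure described in the abstract, repeat pseudo-experiments at a pseudo-true parameter and count how often the constructed interval contains it, can be sketched for a toy Gaussian-mean model (nothing CMSSM-specific; the interval here is the textbook known-variance one):

```python
import random

random.seed(0)

def estimated_coverage(mu=0.0, sigma=1.0, n=50, z=1.96, trials=2000):
    """Fraction of pseudo-experiments whose nominal 95% interval covers mu."""
    hits = 0
    for _ in range(trials):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        mean = sum(xs) / n
        half = z * sigma / n ** 0.5    # known-sigma interval half-width
        hits += (mean - half) <= mu <= (mean + half)
    return hits / trials

cov = estimated_coverage()
```

For this exactly-Gaussian toy the estimate lands near the nominal 0.95; the paper's point is that for the CMSSM profile-likelihood intervals the analogous estimate can deviate substantially in both directions.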
www.eia.gov U.S. Energy Information Administration Independent Statistics & Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Independent Statistics & Analysis. "Effects of Low Oil Prices," 2015 EIA Energy Conference, June 15, 2015, Washington, DC, by Lynn Westfall, U.S. Energy Information Administration. [Slide chart: historical and projected oil prices, quarterly averages of the imported refiner acquisition cost of crude oil and the Brent crude oil price, price per barrel in real 2015 dollars, 1970-2015.]
Statistical comparison of ICRF and NBI heating performance in JET-ILW L-mode plasmas
Lerche, E.; Van Eester, D.; Jacquet, Ph.; Mayoral, M.-L.; Graham, M.; Matthews, G.; Monakhov, I.; Rimini, F.; Colas, L.; Czarnecka, A.; Vries, P. de; Collaboration: JET-EFDA Contributors
2014-02-12
After the changeover from the C-wall to the ITER-like Be/W wall (ILW) in JET, the radiation losses during ICRF heating have increased and are now substantially larger than those observed with NBI at the same power levels, in spite of the similar global plasma energies reached with the two heating systems. A comparison of the NBI and ICRF performances in the JET-ILW experiments, based on a statistical analysis of ~3000 L-mode discharges, will be presented.
The effect of power line phase current correlation on magnetic field statistics
Dabkowski, J. [Electro Sciences, Inc., Crystal Lake, IL (United States)
1995-09-01
Due to normally occurring line currents unbalance, the magnetic field strength will fluctuate in time. The minimum field occurs when the phase currents are balanced, i.e. equal in magnitude and equally spaced in angle. The maximum field levels are obtained when the line currents` fluctuations are statistically independent, and hence, uncorrelated. It is shown that the earth return current due to the unbalance, and therefore, the strength of the magnetic field variations are a function of the line`s phase currents correlation. Power lines whose phase currents are highly correlated will produce a smaller increase in the magnetic field levels for a given percentage of current unbalance.
Statistical Design of Experiment for Li-ion Cell Formation Parameters using
"Gen3" Electrode Materials: Final Summary | Department of Energy. Statistical Design of Experiment for Li-ion Cell Formation Parameters using "Gen3" Electrode Materials: Final Summary. 2009 DOE Hydrogen Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting, May 18-22, 2009, Washington D.C. esp_03_gering.pdf
Statistical Overview of 5 Years of HCCI Fuel and Engine Data from ORNL |
Department of Energy. Statistical Overview of 5 Years of HCCI Fuel and Engine Data from ORNL. Results show a single fuel model could not represent all fuels studied, but engine performance could be predicted with a grouped approach using cetane with secondary effects from volatility or heavy fuel components. deer10_bunting.pdf
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. A distribution which provides the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best-fitted by triangular, normal, or logistic distributions; the lengths were best-fitted by PearsonVI, PearsonV, lognormal2, or extreme-value distributions; and the spacing data were best-fitted by lognormal2, PearsonVI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
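The fit-and-rank procedure can be sketched with a Pearson chi-square statistic computed against two candidate distributions (a simplified stand-in for the paper's 25 hypothesized distributions; the candidate with the lower statistic wins):

```python
import math
import random

random.seed(2)
# Synthetic "fracture length" data, actually drawn from a normal distribution.
data = [random.gauss(5.0, 1.0) for _ in range(2000)]

def norm_cdf(x, mu, s):
    return 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))

def unif_cdf(x, a, b):
    return min(1.0, max(0.0, (x - a) / (b - a)))

def chi_square(data, cdf, edges):
    """Pearson chi-square of binned data against expected counts under cdf."""
    n, stat = len(data), 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        observed = sum(lo <= x < hi for x in data)
        expected = n * (cdf(hi) - cdf(lo))
        if expected > 0:
            stat += (observed - expected) ** 2 / expected
    return stat

mu = sum(data) / len(data)
s = (sum((x - mu) ** 2 for x in data) / len(data)) ** 0.5
edges = [mu + s * t for t in (-3, -2, -1, 0, 1, 2, 3)]
chi_norm = chi_square(data, lambda x: norm_cdf(x, mu, s), edges)
chi_unif = chi_square(data, lambda x: unif_cdf(x, min(data), max(data)), edges)
best_fit = "normal" if chi_norm < chi_unif else "uniform"
```

Ranking all candidates by their chi-square value, as the paper does, is just this comparison repeated across the full list of hypothesized distributions.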
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
Kobilarov, R. G.
2014-11-18
Statistical analysis of the data set consisting of the activity concentrations of {sup 137}Cs in soils in Bansko–Razlog region is carried out in order to establish the dependence of the deposition and the migration of {sup 137}Cs on the soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the activity of {sup 137}Cs in soils were observed. After reduction of the effects of outliers, the data set is divided into two parts, depending on the soil type. A test of normality of the two new data sets shows that they have a normal distribution. The ordinary kriging technique is used to characterize the spatial distribution of the activity of {sup 137}Cs over an area covering 40 km{sup 2} (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of {sup 137}Cs) can be used as a reference point for future studies on the assessment of radiological risk to the population and the erosion of soils in the study area.
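The descriptive step, detect positive skew and see it vanish under a log transform, can be sketched as follows (synthetic lognormal activities stand in for the real {sup 137}Cs measurements):

```python
import math
import random

random.seed(3)
# Hypothetical positively skewed activity concentrations (arbitrary units).
activities = [math.exp(random.gauss(3.0, 0.6)) for _ in range(1000)]

def skewness(xs):
    # Moment-based sample skewness: third central moment over sigma cubed.
    n = len(xs)
    m = sum(xs) / n
    s = (sum((x - m) ** 2 for x in xs) / n) ** 0.5
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

raw_skew = skewness(activities)                          # clearly positive
log_skew = skewness([math.log(x) for x in activities])   # near zero
```

A strongly positive raw skewness with a near-zero skewness after log transformation is the usual signature that a normality-assuming analysis should be run on transformed or partitioned data, as done in the abstract.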
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
Nguyen, Hoa T.; Stone, Daithi; Bethel, E. Wes
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for some specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or introduced error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea, one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
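The core contrast, an average-based reduction versus a variation-preserving one, can be sketched on a 1-D field (a toy stand-in for the paper's multi-dimensional projections and multi-resolution representations):

```python
import numpy as np

rng = np.random.default_rng(4)
field = rng.normal(0.0, 1.0, 256)
field[128:] *= 5.0                  # second half carries strong variation

blocks = field.reshape(32, 8)       # reduce resolution 8:1
reduced_mean = blocks.mean(axis=1)  # averaging: hides where the variation lives
reduced_std = blocks.std(axis=1)    # statistical measure: preserves it
```

Both reductions have the same 32-sample footprint, but only the standard-deviation reduction still distinguishes the quiet half of the field from the highly varying half.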
Dynamics and Statistical Mechanics of Rotating and non-Rotating Vortical Flows
Lim, Chjan
2013-12-18
Three projects were analyzed with the overall aim of developing a computational/analytical model for estimating values of the energy, angular momentum, enstrophy, and total variation of fluid height at phase transitions between disordered and self-organized flow states in planetary atmospheres. It is believed that these transitions in equilibrium statistical mechanics models play a role in the construction of large-scale, stable structures, including super-rotation in the Venusian atmosphere and the formation of the Great Red Spot on Jupiter. Exact solutions of the spherical energy-enstrophy models for rotating planetary atmospheres, obtained by Kac's method of steepest descent, predicted phase transitions to super-rotating solid-body flows at high energy-to-enstrophy ratio for all planetary spins, and to sub-rotating modes if the planetary spin is large enough. These canonical statistical ensembles are well-defined for the long-range energy interactions that arise from 2D fluid flows on compact oriented manifolds such as the surface of the sphere and the torus. This is because, in the Fourier space available through Hodge theory, the energy terms are exactly diagonalizable and hence have zero range, leading to well-defined heat baths.
Statistical Behavior of Formation Process of Magnetic Vortex State in Ni80Fe20 Nanodisks
Im, Mi-Young; Fischer, Peter; Keisuke, Yamada; Kasai, Shinya
2011-01-14
Magnetic vortices in magnetic nanodots, which are characterized by an in-plane magnetization (chirality) and an out-of-plane magnetization (polarity), have attracted intense interest because of their high potential for technological applications in data storage and memory schemes, as well as their scientific interest for understanding fundamental physics in magnetic nanostructures. A complete understanding of the formation process of the vortex state in magnetic vortex systems is a significant issue, both for achieving storage and memory technologies using magnetic vortices and for understanding the intrinsic physical properties of magnetic nanostructures. In our work, we have statistically investigated the formation process of the vortex state in permalloy (Py, Ni{sub 80}Fe{sub 20}) nanodisks through direct observation of the vortex structure using magnetic transmission soft X-ray microscopy (MTXM) with a high spatial resolution down to 20 nm. Magnetic imaging of the Py nanodots was performed at the Fe L{sub 3} (707 eV) absorption edge. Figure 1 shows the in-plane and out-of-plane magnetic components observed in 40 nm thick nanodot arrays with dot radii of r = 500 and 400 nm, respectively. Vortex chirality, either clockwise (CW) or counter-clockwise (CCW), and polarity, either up or down, are clearly visible in both arrays. To investigate the statistical behavior of the vortex-state formation process, the observation of the vortex structure at a remanent state, after saturation of the nanodots by an external magnetic field of 1 kOe, was repeated over 100 times for each array. Typical MTXM images of vortex chirality taken in two successive measurements, together with their overlapped images, in nanodot arrays of r = 500 and 400 nm are displayed in Fig. 2. Within the statistical measurement, the formation of either CW or CCW chirality is quite stochastic in each nanodot.
Similar behavior is also observed in the formation of vortex polarity in consecutive experiments on the same arrays. Interestingly, a particular selectivity between the circulation sense of the chirality and the orientation of the polarity is found in the formation process of the vortex state, despite their respective stochastic generation in repeated measurements. The Dzyaloshinskii-Moriya (D-M) interaction in magnetic nanodisks, which is inevitably generated by the breaking of inversion symmetry at the surface/interface of magnetic thin layers, is mainly responsible for the experimentally observed selectivity between chirality and polarity in the formation of the vortex structure.
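The repeated-nucleation statistics described above can be mimicked with a simple Bernoulli model: treat each field cycle's chirality outcome as a coin flip and test each dot for bias. The trial counts, dot count, and unbiased-coin assumption below are illustrative, not the experimental data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Chirality outcome per nucleation event modeled as an unbiased coin;
# trial and dot counts are assumptions for illustration.
n_trials = 100                  # repeated field cycles per dot
p_cw = 0.5                      # stochastic, unbiased formation
cw_counts = rng.binomial(n_trials, p_cw, size=25)   # 25 dots in an array

# Two-sided binomial test per dot against the unbiased hypothesis.
pvals = [stats.binomtest(int(k), n_trials, 0.5).pvalue for k in cw_counts]
n_biased = sum(p < 0.05 for p in pvals)   # ~5% false positives expected
```

A dot with a genuinely deterministic chirality would show an extreme count and a vanishing p-value; stochastic formation, as reported here, leaves most dots consistent with the unbiased model.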
Statistical modeling support for calibration of a multiphysics model of subcooled boiling flows
Bui, A. V.; Dinh, N. T.; Nourgaliev, R. R.; Williams, B. J.
2013-07-01
Nuclear reactor system analyses rely on multiple complex models which describe the physics of reactor neutronics, thermal hydraulics, structural mechanics, coolant physico-chemistry, etc. Such coupled multiphysics models require extensive calibration and validation before they can be used in practical system safety studies and/or design/technology optimization. This paper presents an application of statistical modeling and Bayesian inference to calibrating an example multiphysics model of subcooled boiling flows, which is widely used in reactor thermal-hydraulic analysis. The presence of complex coupling of physics in such a model, together with the large number of model inputs, parameters, and multidimensional outputs, poses significant challenges to the model calibration method. However, the method proposed in this work is shown to be able to overcome these difficulties while allowing data (observation) uncertainty and model inadequacy to be taken into consideration. (authors)
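A minimal sketch of Bayesian calibration of a model parameter, in the spirit of the approach described, but with a toy one-parameter linear forward model rather than the boiling-flow model; the prior, noise level, and data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy forward model with one uncertain parameter theta, standing in
# for the multiphysics model (illustrative only).
def model(theta, x):
    return theta * x

x_obs = np.linspace(0.1, 1.0, 10)
theta_true = 2.0
sigma = 0.05   # assumed known observation-error standard deviation
y_obs = model(theta_true, x_obs) + rng.normal(0.0, sigma, x_obs.size)

# Grid-based Bayesian inference: flat prior, Gaussian likelihood.
grid = np.linspace(0.0, 4.0, 2001)
loglik = np.array([(-0.5 * ((y_obs - model(t, x_obs)) / sigma) ** 2).sum()
                   for t in grid])
post = np.exp(loglik - loglik.max())   # subtract max for stability
post /= post.sum()

theta_map = grid[post.argmax()]        # posterior mode
theta_mean = (grid * post).sum()       # posterior mean
```

Real multiphysics calibration replaces the grid with MCMC or emulator-based sampling and adds a model-inadequacy term, but the posterior-update logic is the same.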
Beyth, M.; McInteer, C.; Broxton, D.E.; Bolivar, S.L.; Luke, M.E.
1980-06-01
Multivariate statistical analyses were carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Craig quadrangle, Colorado, to support the National Uranium Resource Evaluation and to evaluate strategic or other commercially important mineral resources. A few areas favorable for uranium mineralization are suggested in parts of the Wyoming Basin, Park Range, and Gore Range. Six potential source rocks for uranium are postulated based on factor score mapping. Vanadium in stream sediments is suggested as a pathfinder for carnotite-type mineralization. A probable northwest trend of lead-zinc-copper mineralization associated with Tertiary intrusions is suggested. A few locations are mapped where copper is associated with cobalt. Concentrations of placer sands containing rare earth elements, probably of commercial value, are indicated in parts of the Sand Wash Basin.
Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics
Greene, David L
2010-01-01
U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size, and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested, and similar results are obtained.
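The elasticity estimation underlying such rebound-effect tests can be sketched as an ordinary least squares regression in logs, where the slope on log price is the price elasticity of travel. The series and elasticity below are synthetic assumptions, not the paper's 1966-2007 data or estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 42  # annual observations, a stand-in for a 1966-2007 series
log_price = rng.normal(0.0, 0.3, size=n)
true_elasticity = -0.1          # assumed, not the paper's estimate
log_vmt = 1.0 + true_elasticity * log_price + rng.normal(0.0, 0.01, n)

# OLS in logs: the slope coefficient is the price elasticity of travel.
X = np.column_stack([np.ones(n), log_price])
beta, *_ = np.linalg.lstsq(X, log_vmt, rcond=None)
intercept, elasticity = beta
```

The paper's actual tests add fuel-efficiency terms and time-varying coefficients on top of this basic specification.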
Wave chaos in the stadium: Statistical properties of short-wave solutions of the Helmholtz equation
McDonald, S.W.; Kaufman, A.N.
1988-04-15
We numerically investigate statistical properties of short-wavelength normal modes and the spectrum for the Helmholtz equation in a two-dimensional stadium-shaped region. As the geometrical optics rays within this boundary (billiards) are nonintegrable, this wave problem serves as a simple model for the study of quantum chaos. The local spatial correlation function
Gevorgyan, T. V. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2, 0203 Ashtarak (Armenia); Shahinyan, A. R. [Yerevan State University, A. Manoogian 1, 0025 Yerevan (Armenia); Kryuchkyan, G. Yu. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2, 0203 Ashtarak (Armenia); Yerevan State University, A. Manoogian 1, 0025 Yerevan (Armenia)
2009-05-15
We show that quantum-interference phenomena can be realized for dissipative nonlinear systems exhibiting hysteresis-cycle behavior and quantum chaos. Such results are obtained for a driven dissipative nonlinear oscillator with time-dependent parameters, and they take place in regimes of long time intervals exceeding the dissipation time and at macroscopic levels of oscillatory excitation numbers. Two schemes of time modulation are considered: (i) periodic variation in the strength of the {chi}{sup (3)} nonlinearity; and (ii) periodic modulation of the amplitude of the driving force. These effects are obtained within the framework of phase-space quantum distributions. It is demonstrated that the Wigner functions of the oscillatory mode in both bistable and chaotic regimes acquire negative values and interference patterns in parts of phase space due to appropriate time modulation of the oscillatory nonlinear dynamics. It is also shown that the time modulation of the oscillatory parameters essentially improves the degree of sub-Poissonian statistics of the excitation numbers.
NOISY WEAK-LENSING CONVERGENCE PEAK STATISTICS NEAR CLUSTERS OF GALAXIES AND BEYOND
Fan Zuhui; Shan Huanyuan; Liu Jiayi
2010-08-20
Taking into account noise from intrinsic ellipticities of source galaxies, in this paper we study the peak statistics in weak-lensing convergence maps around clusters of galaxies and beyond. We emphasize how the noise peak statistics are affected by the density distribution of nearby clusters, and also how cluster-peak signals are changed by the existence of noise. These are important aspects to be thoroughly understood in weak-lensing analyses for individual clusters as well as in cosmological applications of weak-lensing cluster statistics. We adopt Gaussian smoothing with the smoothing scale {theta} {sub G} = 0.5arcmin in our analyses. It is found that the noise peak distribution near a cluster of galaxies depends sensitively on the density profile of the cluster. For a cored isothermal cluster with the core radius R{sub c}, the inner region with R {<=} R{sub c} appears noisy, containing on average {approx}2.4 peaks with {nu} {>=} 5 for R{sub c} = 1.7arcmin and a true peak height of the cluster {nu} = 5.6, where {nu} denotes the convergence signal-to-noise ratio. For a Navarro-Frenk-White (NFW) cluster of the same mass and the same central {nu}, the average number of peaks with {nu} {>=} 5 within R {<=} R{sub c} is {approx}1.6. Thus a high peak corresponding to the main cluster can be identified more cleanly in the NFW case. In the outer region with R{sub c} < R {<=} 5R{sub c}, the number of high noise peaks is considerably enhanced in comparison with that of the pure noise case without the nearby cluster. For {nu} {>=} 4, depending on the treatment of the mass-sheet degeneracy in weak-lensing analyses, the enhancement factor f is in the range of {approx}5 to {approx}55 for both clusters, as their outer density profiles are similar. The properties of the main-cluster peak identified in convergence maps are also significantly affected by the presence of noise. Scatters, as well as a systematic shift in the peak height, are present.
The height distribution is peaked at {nu} {approx} 6.6, rather than at {nu} = 5.6, corresponding to a shift of {Delta}{nu} {approx} 1, for the isothermal cluster. For the NFW cluster, {Delta}{nu} {approx} 0.8. The existence of noise also causes a location offset for the weak-lensing identified main-cluster-peak with respect to the true center of the cluster. The offset distribution is very broad and extends to R {approx} R{sub c} for the isothermal case. For the NFW cluster, it is relatively narrow and peaked at R {approx} 0.2R{sub c} . We also analyze NFW clusters of different concentrations. It is found that the more centrally concentrated the mass distribution of a cluster is, the less its weak-lensing signal is affected by noise. Incorporating these important effects and the mass function of NFW dark matter halos, we further present a model calculating the statistical abundances of total convergence peaks, true and false ones, over a large field beyond individual clusters. The results are in good agreement with those from numerical simulations. The model then allows us to probe cosmologies with the convergence peaks directly without the need of expensive follow-up observations to differentiate true and false peaks.
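The noise-peak counting at the heart of this analysis can be sketched as follows: smooth a white-noise map (standing in for intrinsic-ellipticity shape noise) with a Gaussian window and count local maxima above a signal-to-noise threshold. The map size, smoothing scale, and threshold are illustrative assumptions, not the paper's survey parameters.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
# White noise standing in for intrinsic-ellipticity shape noise,
# smoothed with a Gaussian window as in the convergence-map analysis.
noise = rng.normal(size=(256, 256))
kappa = ndimage.gaussian_filter(noise, sigma=4.0)
nu = kappa / kappa.std()          # convergence signal-to-noise map

# A pixel is a peak if it equals the maximum of its 3x3 neighborhood.
is_peak = ndimage.maximum_filter(nu, size=3) == nu
peak_heights = nu[is_peak]
n_high = int((peak_heights >= 3.0).sum())   # count of high noise peaks
```

In the paper's setting, the same counting is done in maps that also contain the cluster signal, so the high-peak counts near a cluster are enhanced relative to this pure-noise baseline.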
Mexico City air quality research initiative: An overview and some statistical aspects
Waller, R.A.; Streit, G.E.; Guzman, F.
1991-01-01
The Mexican Petroleum Institute (Instituto Mexicano del Petroleo, IMP) and Los Alamos National Laboratory (LANL) are in the first year of a three-year, jointly funded project to examine the air quality in Mexico City and to provide techniques to evaluate the impact of proposed mitigation options. The technical tasks include modeling and simulation; monitoring and characterization; and strategic evaluation. Extensive measurements of the atmosphere, climate, and meteorology are being made as part of the study. This presentation provides an overview of the total project plan, reports on the current status of the technical tasks, describes the data collection methods, presents examples of the data analysis and graphics, and suggests roles for statistical analysis in this and similar environmental studies. 8 figs., 4 tabs.
Statistical charge distribution over dust particles in a non-Maxwellian Lorentzian plasma
Mishra, S. K. [Institute for Plasma Research (IPR), Gandhinagar-382428 (India); Misra, Shikha, E-mail: shikhamish@gmail.com [Centre for Energy Studies (CES), Indian Institute of Technology Delhi (IITD), New Delhi-110016 (India)
2014-07-15
On the basis of statistical mechanics and charging kinetics, the charge distribution over uniformly sized spherical dust particles in a non-Maxwellian Lorentzian plasma is investigated. Two specific situations are taken into account, viz., (i) a plasma in thermal equilibrium and (ii) a non-equilibrium state where the plasma is dark (no emission) or irradiated by laser light (including photoemission). The formulation includes the population balance equation for the charged particles along with the number and energy balance of the complex plasma constituents. The departure of the results for the Lorentzian plasma from those for a Maxwellian plasma is graphically illustrated and discussed; it is shown that the charge distribution tends to the Maxwellian-plasma result for large spectral index. The charge distribution predicts opposite charging of the dust particles in certain cases.
Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations
Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.
2010-12-22
In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining internal structure of a 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of the bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and get insight into the nature of bonding. (2) Projecting to a lower dimensional subspace ({approx}4-5) using PCA or Kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to the existing descriptors of electronic structure of molecules. 
Also, these results can be immediately used to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to indicate that one needs 4 or 5 components to account for most of the variance in the data; hence this 5D dataset does not necessarily lie on a well-defined, low-dimensional manifold. In terms of specific clustering techniques, K-means was generally useful in exploring the dataset. The partition around medoids (PAM) technique produced the most definitive results for our data, showing distinctive patterns for both a sample of the complete data and the time series. The gap statistic with the Tibshirani criterion did not provide any distinction across the two datasets. The gap statistic with the DandF criteria, model-based clustering, and hierarchical modeling simply failed to run on our datasets. Thankfully, the vanilla PCA technique was successful in handling our entire dataset. PCA revealed some interesting patterns in the scalar value distribution. Kernel PCA techniques (vanilladot, RBF, polynomial) and MDS failed to run on the entire dataset, or even a significant fraction of the dataset, and we resorted to creating an explicit feature map followed by conventional PCA. Clustering using K-means and PAM in the new basis set seems to produce promising results. Understanding the new basis set in the scientific context of the problem is challenging, and we are currently working to further examine and interpret the results.
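A minimal numpy/scipy sketch of the PCA-plus-clustering workflow the report describes, run on synthetic two-cluster data standing in for the walker trajectories; the dimensionality, cluster separation, and cluster count are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(4)
# Synthetic stand-in for points in a 6D phase space: two well-separated
# clusters, mimicking distinct electronic configurations.
a = rng.normal(0.0, 0.3, size=(200, 6)) + np.r_[2.0, 0, 0, 0, 0, 0]
b = rng.normal(0.0, 0.3, size=(200, 6)) - np.r_[2.0, 0, 0, 0, 0, 0]
X = np.vstack([a, b])

# PCA by SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()     # variance fraction per component
scores = Xc @ Vt[:2].T              # projection onto the first two PCs

# k-means in the reduced space recovers the two clusters.
centroids, labels = kmeans2(scores, 2, seed=5, minit='++')
```

The report's pipeline does the same projection-then-cluster step (with PAM rather than plain k-means, and on far noisier data), and inspects `explained` to decide how many components to keep.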
Table 1. Summary statistics for natural gas in the United States, 2010-2014
U.S. Energy Information Administration (EIA) Indexed Site
See footnotes at end of table. (R = revised)

                                               2010        2011        2012          2013        2014
Number of Wells Producing at End of Year    487,627     514,637     482,822     R 484,994     514,786
Production (million cubic feet)
  Gross Withdrawals From Gas Wells       13,247,498  12,291,070  12,504,227  R 10,759,545  10,384,119
  Gross Withdrawals From Oil Wells        5,834,703   5,907,919   4,965,833  R  5,404,699   5,922,088
  Gross Withdrawals From Coalbed Wells    1,916,762   1,779,055   1,539,395  R  1,425,783   1,285,189
  Gross Withdrawals From Shale Gas Wells  5,817,122   8,500,983   [remaining values truncated in source]
Multiple-point statistical prediction on fracture networks at Yucca Mountain
Liu, X.Y; Zhang, C.Y.; Liu, Q.S.; Birkholzer, J.T.
2009-05-01
In many underground nuclear waste repository systems, such as Yucca Mountain, the water flow rate and the amount of water seepage into the waste emplacement drifts are mainly determined by the hydrological properties of the fracture network in the surrounding rock mass. A natural fracture network system is not easy to describe, especially with respect to its connectivity, which is critically important for simulating the water flow field. In this paper, we introduce a new method for fracture network description and prediction, termed multiple-point statistics (MPS). The MPS method records multiple-point statistics concerning the connectivity patterns of a fracture network from a known fracture map, and reproduces multiple-scale training fracture patterns in a stochastic manner, implicitly and directly. It is applied to fracture data to study flow-field behavior in the Yucca Mountain waste repository system. First, the MPS method is used to create a fracture network from an original fracture training image from the Yucca Mountain dataset. After adopting a harmonic and arithmetic averaging method to upscale the permeability to a coarse grid, a THM simulation is carried out to study near-field water flow around the waste emplacement drifts. Our study shows that the connectivity or patterns of fracture networks can be grasped and reconstructed by MPS methods. In theory, this will lead to better prediction of fracture system characteristics and flow behavior. Meanwhile, we can obtain the variance of the flow field, which gives us a way to quantify model uncertainty even in complicated coupled THM simulations. This indicates that MPS can potentially characterize and reconstruct natural fracture networks in a fractured rock mass, with the advantage of quantifying the connectivity of the fracture system and its simulation uncertainty simultaneously.
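The core MPS statistic, the frequency of multiple-point connectivity patterns scanned from a training image, can be sketched directly. The tiny binary training image below is an illustrative assumption, not the Yucca Mountain data; real MPS implementations use much larger templates and images.

```python
import numpy as np
from collections import Counter

# A tiny binary "training image": 1 = fracture, 0 = intact rock
# (assumed for illustration; note the connected vertical fracture).
ti = np.array([
    [0, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1],
])

# Scan every 2x2 template and record the pattern frequencies; this
# histogram is the statistic that MPS simulation later draws from.
patterns = Counter()
h, w = ti.shape
for i in range(h - 1):
    for j in range(w - 1):
        patterns[tuple(ti[i:i+2, j:j+2].ravel())] += 1

total = sum(patterns.values())   # (h-1)*(w-1) = 12 templates
```

Stochastic simulation then fills an empty grid by drawing, at each node, a value consistent with the recorded pattern frequencies around it, which is how connectivity is reproduced rather than just two-point correlation.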
DYNAMIC STABILITY OF THE SOLAR SYSTEM: STATISTICALLY INCONCLUSIVE RESULTS FROM ENSEMBLE INTEGRATIONS
Zeebe, Richard E.
2015-01-01
Due to the chaotic nature of the solar system, the question of its long-term stability can only be answered in a statistical sense, for instance, based on numerical ensemble integrations of nearby orbits. Destabilization of the inner planets, leading to close encounters and/or collisions, can be initiated through a large increase in Mercury's eccentricity, with a currently assumed likelihood of {approx}1%. However, little is known at present about the robustness of this number. Here I report ensemble integrations of the full equations of motion of the eight planets and Pluto over 5 Gyr, including contributions from general relativity. The results show that different numerical algorithms lead to statistically different results for the evolution of Mercury's eccentricity (e{sub M}). For instance, starting at present initial conditions (e{sub M} {approx} 0.21), Mercury's maximum eccentricity achieved over 5 Gyr is, on average, significantly higher in symplectic ensemble integrations using heliocentric rather than Jacobi coordinates and stricter error control. In contrast, starting at a possible future configuration (e{sub M} {approx} 0.53), Mercury's maximum eccentricity achieved over the subsequent 500 Myr is, on average, significantly lower using heliocentric rather than Jacobi coordinates. For example, the probability for e{sub M} to increase beyond 0.53 over 500 Myr is >90% (Jacobi) versus only 40%-55% (heliocentric). This poses a dilemma because the physical evolution of the real system, and its probabilistic behavior, cannot depend on the coordinate system or the numerical algorithm chosen to describe it. Some tests of the numerical algorithms suggest that symplectic integrators using heliocentric coordinates underestimate the odds for destabilization of Mercury's orbit at high initial e{sub M}.
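The paper's central point, that the numerical algorithm can bias statistical conclusions, rests on the basic distinction between symplectic and non-symplectic integrators. A minimal sketch on a harmonic oscillator (not the planetary problem) shows the qualitative difference in long-run energy behavior:

```python
def explicit_euler(q, p, dt, n):
    """Non-symplectic: for this oscillator the energy grows by a
    factor (1 + dt^2) every step, i.e. secular drift."""
    for _ in range(n):
        q, p = q + dt * p, p - dt * q
    return q, p

def leapfrog(q, p, dt, n):
    """Symplectic kick-drift-kick: the energy error stays bounded."""
    for _ in range(n):
        p -= 0.5 * dt * q
        q += dt * p
        p -= 0.5 * dt * q
    return q, p

def energy(q, p):
    # Harmonic oscillator H = (p^2 + q^2) / 2
    return 0.5 * (p * p + q * q)

e0 = energy(1.0, 0.0)
qe, pe = explicit_euler(1.0, 0.0, 0.01, 10_000)
ql, pl = leapfrog(1.0, 0.0, 0.01, 10_000)
err_euler = abs(energy(qe, pe) - e0)
err_leapfrog = abs(energy(ql, pl) - e0)
```

Over Gyr-scale integrations, even differences between two symplectic formulations (heliocentric versus Jacobi coordinates, as in the paper) can accumulate into statistically distinguishable ensembles, which is precisely the reported dilemma.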
Brady, Samuel L.; Shulkin, Barry L.
2015-02-15
Purpose: To develop ultralow-dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography/CT (PET/CT). Methods: A GE Discovery 690 PET/CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10–35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed down to a 90% reduction in volume computed tomography dose index (0.39/3.64 mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body-weight standardized uptake value (SUV{sub bw}) of various diameter targets (range 8–37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% of the non-dose-reduced CTAC image down to 90% dose reduction. No change in SUV{sub bw}, background percent uniformity, or spatial resolution was found for PET images reconstructed with CTAC protocols down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3–0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from pre-dose-reduced patient images. Conclusions: Using ASiR allowed aggressive reduction in CT dose with no change in the reconstructed PET images, while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.
Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties
Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald; Lee, Andrew K.; Sahoo, Narayan; Tucker, Susan L.; Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E.; Dong, Lei
2013-08-01
Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on the dose distribution and dose-volume histogram using statistical parameters, and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans and 10 prostate and 1 brain cancer scanning-beam proton plans were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of the simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of the dose-volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure of target coverage or overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with a mean difference of -1.1% (-0.9% for breath-hold), -0.3%, and -2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and the standard deviation are useful for evaluating the impact of uncertainties. The existing proton beam planning method used in this institution appears adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
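The simulation strategy described, recalculating the dose many times under random setup errors and summarizing coverage statistically, can be sketched in one dimension. The dose profile, error magnitudes, and coverage metric below are toy assumptions, not a dose engine.

```python
import numpy as np

rng = np.random.default_rng(6)

def dose_profile(x):
    """Toy 1D dose: high-dose plateau with a sigmoid falloff (a.u.)."""
    return 1.0 / (1.0 + np.exp((np.abs(x) - 3.0) / 0.2))

target = np.linspace(-2.5, 2.5, 51)    # target voxel positions (cm)
n_sim = 600                            # 600 recalculations, as in the paper

# Each simulation shifts the dose by a random systematic setup error
# (5 mm standard deviation, an assumed value).
shifts = rng.normal(0.0, 0.5, size=n_sim)
doses = np.array([dose_profile(target - s) for s in shifts])

# Per-simulation coverage: fraction of the target receiving >= 95% dose.
coverage = (doses >= 0.95).mean(axis=1)
nominal = (dose_profile(target) >= 0.95).mean()
expected, sd = coverage.mean(), coverage.std()
```

As in the paper's findings, the expected coverage under uncertainty falls below the nominal (error-free) value, and the standard deviation quantifies plan robustness.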
Frome, EL
2005-09-20
Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on maximum likelihood for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with its upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality, nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
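The maximum likelihood method for left-censored lognormal data can be sketched directly: on the log scale, detected values contribute normal density terms and non-detects contribute a CDF term at the detection limit. The data here are synthetic, and the report's actual implementation is in R; this is a Python illustration of the same likelihood.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
mu_true, sigma_true = 0.0, 1.0
x = rng.lognormal(mu_true, sigma_true, size=200)

dl = 0.5                 # assumed detection limit
detected = x >= dl       # values below dl are left-censored non-detects

def neg_loglik(theta):
    """Censored lognormal likelihood on the log scale."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    ll = stats.norm.logpdf(np.log(x[detected]), mu, sigma).sum()
    ll += (~detected).sum() * stats.norm.logcdf(np.log(dl), mu, sigma)
    return -ll

res = optimize.minimize(neg_loglik, x0=[np.log(x[detected]).mean(), 0.0],
                        method='Nelder-Mead')
mu_hat, sigma_hat = res.x[0], float(np.exp(res.x[1]))

# Exceedance fraction: estimated probability a measurement exceeds L.
L = 5.0
exceedance = 1.0 - stats.norm.cdf((np.log(L) - mu_hat) / sigma_hat)
```

Confidence limits on the mean, an upper percentile, or the exceedance fraction would then follow from the estimated information matrix or profile likelihood, as described in the report.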
Ringrose, P.; Pickup, G.; Jensen, J.
1997-08-01
We have used a reservoir gridblock-sized outcrop (10 m by 100 m) of fluvio-deltaic sandstones to evaluate the importance of internal heterogeneity for a hypothetical waterflood displacement process. Using a dataset based on probe permeameter measurements taken from two vertical transects representing "wells" (5 cm sampling) and one "core" sample (exhaustive 1 mm-spaced sampling), we evaluate the permeability variability at different length scales, the correlation characteristics (structure of the variogram function), and larger-scale trends. We then relate these statistical measures to the sedimentology. We show how the sediment architecture influences the effective tensor permeability at the lamina and bed scale, and then calculate the effective relative permeability functions for a waterflood. We compare the degree of oil recovery from the formation (a) using averaged borehole data and no geological structure, and (b) modelling the sediment architecture of the interwell volume using mixed stochastic/deterministic methods. We find that the sediment architecture has an important effect on flow performance, mainly due to bed-scale capillary trapping and a consequent reduction in the effective oil mobility. The predicted oil recovery differs by 18% when these small-scale effects are included in the model. Traditional reservoir engineering methods, using average permeability values, only prove acceptable in high-permeability and low-heterogeneity zones. The main outstanding challenge, illustrated by this example of sub-gridblock-scale heterogeneity, is how to capture the relevant geological structure along with the inherent geostatistical variability. An approach to this problem is proposed.
Baumgartner, S.; Bieli, R.; Bergmann, U. C.
2012-07-01
An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse, in collaboration with KKL and Axpo (operator and owner of the Leibstadt NPP), has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set up for the analyses of Anticipated Operational Occurrences (AOOs) and accidents. In the Monte Carlo approach, statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to that of all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from its introduction at KKL. (authors)
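The Monte Carlo idea, deriving a CPR operating limit from a simulated failure-probability criterion, can be sketched with a toy uncertainty model. All distributions and magnitudes below are illustrative assumptions, not plant data or the McSLAP model.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

# Toy uncertainty model: the true CPR equals the calculated CPR times a
# random factor (correlation and measurement uncertainty), minus a
# channel-bow penalty sampled from an assumed measured distribution.
factor = rng.normal(1.0, 0.04, n)
bow_penalty = np.abs(rng.normal(0.0, 0.02, n))   # bow always penalizes
ratio = factor - bow_penalty      # true CPR per unit of calculated CPR

# Safety limit: calculated CPR such that P(true CPR < 1.0) <= 1%,
# standing in for a probabilistic SLMCPR-type criterion.
slmcpr = 1.0 / np.quantile(ratio, 0.01)

# Cross-check by direct simulation at that limit.
p_fail = ((slmcpr * ratio) < 1.0).mean()
```

Folding the channel-bow term into the same Monte Carlo sample, rather than applying a separate deterministic penalty, is the best-estimate treatment the abstract describes.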
NREL Releases Renewable Energy Data Book Detailing Growing Industry in 2012
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
- News Releases | NREL Releases Renewable Energy Data Book Detailing Growing Industry in 2012 November 21, 2013 The National Renewable Energy Laboratory (NREL) has released the 2012 Renewable Energy Data Book on behalf of the Energy Department's Office of Energy Efficiency and Renewable Energy. The annual report is an important assessment of U.S. energy statistics for 2012, including renewable electricity, worldwide renewable energy development, clean energy investments, and data on specific
LyMAS: Predicting large-scale Lyα forest statistics from the dark matter density field
Peirani, Sébastien; Colombi, Stéphane; Dubois, Yohan; Pichon, Christophe; Weinberg, David H.; Blaizot, Jérémy
2014-03-20
We describe the Lyα Mass Association Scheme (LyMAS), a method of predicting clustering statistics in the Lyα forest on large scales from moderate-resolution simulations of the dark matter (DM) distribution, with calibration from high-resolution hydrodynamic simulations of smaller volumes. We use the 'Horizon-MareNostrum' simulation, a 50 h^-1 Mpc comoving volume evolved with the adaptive mesh hydrodynamic code RAMSES, to compute the conditional probability distribution P(F_s | δ_s) of the transmitted flux F_s, smoothed (one-dimensionally, 1D) over the spectral resolution scale, on the DM density contrast δ_s, smoothed (three-dimensionally, 3D) over a similar scale. In this study we adopt the spectral resolution of the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS) at z = 2.5, and we find optimal results for a comoving DM smoothing length of 0.3 h^-1 Mpc. In its simplest form, LyMAS draws randomly from the hydro-calibrated P(F_s | δ_s) to convert DM skewers into Lyα forest pseudo-spectra, which are then used to compute cross-sightline flux statistics. In extended form, LyMAS exactly reproduces both the 1D power spectrum and the one-point flux distribution of the hydro simulation spectra. Applied to the MareNostrum DM field, LyMAS accurately predicts the two-point conditional flux distribution and flux correlation function of the full hydro simulation for transverse sightline separations as small as 1 h^-1 Mpc, including redshift-space distortion effects. It is substantially more accurate than a deterministic density-flux mapping (the 'Fluctuating Gunn-Peterson Approximation'), often used for large-volume simulations of the forest. With the MareNostrum calibration, we apply LyMAS to 1024^3-particle N-body simulations of 300 h^-1 Mpc and 1.0 h^-1 Gpc cubes to produce large, publicly available catalogs of mock BOSS spectra that probe a large comoving volume.
LyMAS will be a powerful tool for interpreting 3D Lyα forest data, thereby transforming measurements from BOSS and other massive quasar absorption surveys into constraints on dark energy, DM, space geometry, and intergalactic medium physics.
Mortality in Appalachian coal mining regions: the value of statistical life lost
Hendryx, M.; Ahern, M.M.
2009-07-15
We examined elevated mortality rates in Appalachian coal mining areas for 1979-2005, and estimated the corresponding value of statistical life (VSL) lost relative to the economic benefits of the coal mining industry. We compared age-adjusted mortality rates and socioeconomic conditions across four county groups: Appalachia with high levels of coal mining, Appalachia with lower mining levels, Appalachia without coal mining, and other counties in the nation. We converted mortality estimates to VSL estimates and compared the results with the economic contribution of coal mining. We also conducted a discount analysis to estimate current benefits relative to future mortality costs. The heaviest coal mining areas of Appalachia had the poorest socioeconomic conditions. Before adjusting for covariates, the number of excess annual age-adjusted deaths in coal mining areas ranged from 3,975 to 10,923, depending on years studied and comparison group. Corresponding VSL estimates ranged from $18.563 billion to $84.544 billion, with a point estimate of $50.010 billion, greater than the $8.088 billion economic contribution of coal mining. After adjusting for covariates, the number of excess annual deaths in mining areas ranged from 1,736 to 2,889, and VSL costs continued to exceed the benefits of mining. Discounting VSL costs into the future resulted in excess costs relative to benefits in seven of eight conditions, with a point estimate of $41.846 billion.
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
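The 8,000 h prediction quoted above comes from Arrhenius extrapolation of the mean time to failure. A minimal sketch of that extrapolation, where only the 0.85 eV activation energy is taken from the abstract and the 250 h reference time at 85 degrees C is a hypothetical stand-in:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K


def mttf_arrhenius(mttf_ref_h, t_ref_c, t_target_c, ea_ev):
    """Extrapolate mean time to failure (hours) from a reference stress
    temperature to a target temperature via the Arrhenius law:
    MTTF ~ exp(Ea / (k_B * T))."""
    t_ref = t_ref_c + 273.15       # convert to kelvin
    t_target = t_target_c + 273.15
    accel = math.exp(ea_ev / K_B * (1.0 / t_target - 1.0 / t_ref))
    return mttf_ref_h * accel


# Hypothetical example: modules reaching 1% relative power loss after
# 250 h at 85 C, extrapolated to 25 C with Ea = 0.85 eV.
mttf_25 = mttf_arrhenius(250.0, 85.0, 25.0, 0.85)
```

The acceleration factor between 85 degrees C and 25 degrees C at this activation energy is roughly 250x, which is why in-situ testing at elevated temperature shortens the experiment so dramatically.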
Penarrubia, Jorge; Walker, Matthew G.
2012-11-20
We introduce the Minimum Entropy Method, a simple statistical technique for constraining the Milky Way gravitational potential and simultaneously testing different gravity theories directly from 6D phase-space surveys and without adopting dynamical models. We demonstrate that orbital energy distributions that are separable (i.e., independent of position) have an associated entropy that increases under wrong assumptions about the gravitational potential and/or gravity theory. Of known objects, 'cold' tidal streams from low-mass progenitors follow orbital distributions that most nearly satisfy the condition of separability. Although the orbits of tidally stripped stars are perturbed by the progenitor's self-gravity, systematic variations of the energy distribution can be quantified in terms of the cross-entropy of individual tails, giving further sensitivity to theoretical biases in the host potential. The feasibility of using the Minimum Entropy Method to test a wide range of gravity theories is illustrated by evolving restricted N-body models in a Newtonian potential and examining the changes in entropy introduced by Dirac, MONDian, and f(R) gravity modifications.
Statistically designed study of the variables and parameters of carbon dioxide equations of state
Donohue, M.D.; Naiman, D.Q.; Jin, Gang; Loehe, J.R.
1991-05-01
Carbon dioxide is used widely in enhanced oil recovery (EOR) processes to maximize the production of crude oil from aging and nearly depleted oil wells. Carbon dioxide also is encountered in many processes related to oil recovery. Accurate representations of the properties of carbon dioxide, and its mixtures with hydrocarbons, play a critical role in a number of enhanced oil recovery operations. One of the first tasks of this project was to select an equation of state to calculate the properties of carbon dioxide and its mixtures. Each equation's simplicity, accuracy, and reliability in representing the phase behavior and thermodynamic properties of mixtures containing carbon dioxide with hydrocarbons at conditions relevant to enhanced oil recovery were taken into account. We also have determined the thermodynamic properties that are important to enhanced oil recovery and the ranges of temperature, pressure, and composition that are important. We chose twelve equations of state for preliminary studies to be evaluated against these criteria. All of these equations were tested for pure carbon dioxide, and eleven were tested for pure alkanes and their mixtures with carbon dioxide. Two equations, the ALS equation and the ESD equation, were selected for detailed statistical analysis. 54 refs., 41 figs., 36 tabs.
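The ALS and ESD equations selected in the study are not spelled out in the abstract, so as an illustration of how a cubic equation of state yields CO2 properties, the sketch below uses the widely known Peng-Robinson equation instead; the critical constants are standard literature values and the 320 K / 1 L/mol state point is arbitrary:

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Critical properties of CO2 (standard literature values)
TC, PC, OMEGA = 304.13, 7.3773e6, 0.225  # K, Pa, acentric factor


def peng_robinson_pressure(t, v):
    """Pressure (Pa) of pure CO2 from the Peng-Robinson equation of
    state, given temperature t (K) and molar volume v (m^3/mol)."""
    a = 0.45724 * R**2 * TC**2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(t / TC)))**2
    # Repulsive term minus the temperature-dependent attractive term
    return R * t / (v - b) - a * alpha / (v * v + 2.0 * b * v - b * b)


# CO2 vapor at 320 K and a molar volume of 1 L/mol
p = peng_robinson_pressure(320.0, 1.0e-3)
```

At this state the predicted pressure falls noticeably below the ideal-gas value RT/v, reflecting the attractive term, which is the qualitative behavior any candidate EOS must capture near EOR conditions.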
Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.
2009-01-01
Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.
2015-08-19
Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidate inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. As a result, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
Analysis of hyper-spectral data derived from an imaging Fourier transform: A statistical perspective
Sengupta, S.K.; Clark, G.A.; Fields, D.J.
1996-01-10
Fourier transform spectrometers (FTS) using optical sensors are increasingly being used in various branches of science. Typically, a FTS generates a three-dimensional data cube with two spatial dimensions and one frequency/wavelength dimension. The number of frequency dimensions in such data cubes is generally very large, often in the hundreds, making data analytical procedures extremely complex. In the present report, the problem is viewed from a statistical perspective. A set of procedures based on the high degree of inter-channel correlation structure often present in such hyper-spectral data, has been identified and applied to an example data set of dimension 100 x 128 x 128 comprising 128 spectral bands. It is shown that in this case, the special eigen-structure of the correlation matrix has allowed the authors to extract just a few linear combinations of the channels (the significant principal vectors) that effectively contain almost all of the spectral information contained in the data set analyzed. This in turn, enables them to segment the objects in the given spatial frame using, in a parsimonious yet highly effective way, most of the information contained in the data set.
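The dimensionality reduction described, extracting a few significant principal vectors from the highly inter-correlated channels, can be sketched on a small synthetic cube; the 16 bands and two underlying spectral sources below are illustrative stand-ins for the 128-band data set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a hyper-spectral cube: 128 x 128 pixels and
# 16 spectral bands driven by only 2 underlying spectral sources, so
# the channels are highly inter-correlated (as in the report's data).
sources = rng.normal(size=(128 * 128, 2))
mixing = rng.normal(size=(2, 16))
cube = sources @ mixing + 0.05 * rng.normal(size=(128 * 128, 16))

# Eigen-decomposition of the inter-channel correlation matrix
corr = np.corrcoef(cube, rowvar=False)   # 16 x 16
eigvals, eigvecs = np.linalg.eigh(corr)  # ascending eigenvalues
eigvals = eigvals[::-1]                  # reorder to descending

# Fraction of spectral variance carried by the top 2 principal vectors
explained = eigvals[:2].sum() / eigvals.sum()
```

With only two true sources, the first two principal vectors capture essentially all of the spectral information, which is the parsimony the report exploits for segmentation.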
Statistics of particle time-temperature histories: progress report for June 2013
Hewson, John C.; Gin, Craig; Lignell, David O. [Brigham Young University, Provo, UT]; Sun, Guangyuan [Brigham Young University, Provo, UT]
2013-10-01
Progress toward predictions of the statistics of particle time-temperature histories is presented. These predictions are to be made using Lagrangian particle models within the one-dimensional turbulence (ODT) model. In the present reporting period we have further characterized the performance, behavior and capabilities of the particle dispersion models that were added to the ODT model in the first period. We have also extended the capabilities in two manners. First we provide alternate implementations of the particle transport process within ODT; within this context the original implementation is referred to as the type-I and the new implementations are referred to as the type-C and type-IC interactions. Second we have developed and implemented models for two-way coupling between the particle and fluid phase. This allows us to predict the reduced rate of turbulent mixing associated with particle dissipation of energy and similar phenomena. Work in characterizing these capabilities has taken place in homogeneous decaying turbulence, in free shear layers, in jets and in channel flow with walls, and selected results are presented.
A statistical study of the macroepidemiology of air pollution and total mortality
Lipfert, F.W.; Malone, R.G.; Daum, M.L.; Mendell, N.R.; Yang, Chin-Chun
1988-04-01
A statistical analysis of spatial patterns of 1980 US urban total mortality (all causes) was performed, evaluating demographic, socioeconomic and air pollution factors as predictors. Specific mortality predictors included cigarette smoking, drinking water hardness, heating fuel use, and 1978-1982 annual concentrations of the following air pollutants: ozone, carbon monoxide, sulfate aerosol, particulate concentrations of lead, iron, cadmium, manganese, vanadium, as well as total and fine particle mass concentrations from the inhalable particulate network (dichotomous samplers). In addition, estimates of sulfur dioxide, oxides of nitrogen, and sulfate aerosol were made for each city using the ASTRAP long-range transport diffusion model, and entered into the analysis as independent variables. Because the number of cities with valid air quality and water hardness data varied considerably by pollutant, it was necessary to consider several different data sets, ranging from 48 to 952 cities. The relatively strong associations (ca. 5--10%) shown for 1980 pollution with 1980 total mortality are generally not confirmed by independent studies, for example, in Europe. In addition, the US studies did not find those pollutants with known adverse health effects at the concentrations in question (such as ozone or CO) to be associated with mortality. The question of causality vs. circumstantial association must therefore be regarded as still unresolved. 59 refs., 20 figs., 40 tabs.
Statistics of reversible transitions in two-state trajectories in force-ramp spectroscopy
Diezemann, Gregor
2014-05-14
A possible way to extract information about the reversible dissociation of a molecular adhesion bond from force fluctuations observed in force ramp experiments is discussed. For small loading rates the system undergoes a limited number of unbinding and rebinding transitions observable in the so-called force versus extension (FE) curves. The statistics of these transient fluctuations can be utilized to estimate the parameters for the rebinding rate. This is relevant in the experimentally important situation where the direct observation of the reversed FE-curves is hampered, e.g., due to the presence of soft linkers. I generalize the stochastic theory of the kinetics in two-state models to the case of time-dependent kinetic rates and compute the relevant distributions of characteristic forces. While for irreversible systems there is an intrinsic relation between the rupture force distribution and the population of the free-energy well of the bound state, the situation is slightly more complex if reversible systems are considered. For a two-state model, a “stationary” rupture force distribution that is proportional to the population can be defined and allows one to consistently discuss quantities averaged over the transient fluctuations. While irreversible systems are best analyzed in the soft spring limit of small pulling device stiffness and large loading rates, here I argue for using the stiffness of the pulling device as a control parameter in addition to the loading rate.
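The intrinsic relation between the rupture force distribution and the bound-state population in the irreversible limit is usually written in the Bell-Evans form. A minimal sketch with hypothetical parameter values (the paper's reversible two-state generalization is more involved):

```python
import math


def rupture_force_pdf(f, k0, f_beta, rate):
    """Bell-Evans rupture-force density for an irreversible bond pulled
    at constant loading rate: the force-dependent unbinding rate is
    k(F) = k0 * exp(F / f_beta), and survival under the ramp F = rate*t
    gives p(F) = (k(F)/rate) * exp(-(f_beta/rate) * (k(F) - k0))."""
    k_f = k0 * math.exp(f / f_beta)
    return (k_f / rate) * math.exp(-(f_beta / rate) * (k_f - k0))


# Hypothetical parameters: k0 = 1e-3 1/s, f_beta = 5 pN, rate = 100 pN/s.
# The most probable rupture force is f_beta * ln(rate / (k0 * f_beta)).
f_star = 5.0 * math.log(100.0 / (1.0e-3 * 5.0))  # about 49.5 pN
```

The density is normalized over force and peaks at f_star, shifting logarithmically with loading rate, which is the standard signature of dynamic force spectroscopy.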
Financial statistics of major U.S. publicly owned electric utilities 1995
1997-07-01
The 1995 Edition of the Financial Statistics of Major U.S. Publicly Owned Electric Utilities publication presents 5 years (1991 through 1995) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator (Tables 3 through 11) and nongenerator (Tables 12 through 20) summaries are presented in this publication. Five years of summary financial data are provided (Tables 5 through 11 and 14 through 20). Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided in Appendix C. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, and operating revenue, and electric energy account data. 9 figs., 87 tabs.
Bein, B. M.; Berkebile-Stoiser, S.; Veronig, A. M.; Temmer, M.; Muhr, N.; Kienreich, I.; Utz, D.
2011-09-10
We use high time cadence images acquired by the STEREO EUVI and COR instruments to study the evolution of coronal mass ejections (CMEs) from their initiation through impulsive acceleration to the propagation phase. For a set of 95 CMEs we derived detailed height, velocity, and acceleration profiles and statistically analyzed characteristic CME parameters: peak acceleration, peak velocity, acceleration duration, initiation height, height at peak velocity, height at peak acceleration, and size of the CME source region. The CME peak accelerations we derived range from 20 to 6800 m s^-2 and are inversely correlated with the acceleration duration and the height at peak acceleration. Seventy-four percent of the events reach their peak acceleration at heights below 0.5 R_sun. CMEs that originate from compact sources low in the corona are more impulsive and reach higher peak accelerations at smaller heights. These findings can be explained by the Lorentz force, which drives the CME acceleration and decreases with height and CME size.
Eberhardt, L.L.; Thomas, J.M.
1986-07-01
This project was designed to develop guidance for implementing 10 CFR Part 61 and to determine the overall needs for sampling and statistical work in characterizing, surveying, monitoring, and closing commercial low-level waste sites. When cost-effectiveness and statistical reliability are of prime importance, then double sampling, compositing, and stratification (with optimal allocation) are identified as key issues. If the principal concern is avoiding questionable statistical practice, then the applicability of kriging (for assessing spatial pattern), methods for routine monitoring, and use of standard textbook formulae in reporting monitoring results should be reevaluated. Other important issues identified include sampling for estimating model parameters and the use of data from left-censored (less than detectable limits) distributions.
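Stratification with optimal allocation, one of the key issues identified, is classically done with Neyman allocation: sample each stratum in proportion to its size times its standard deviation. A minimal sketch with hypothetical strata:

```python
def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Optimal (Neyman) allocation of n_total samples across strata:
    n_h proportional to N_h * S_h, where N_h is the stratum size and
    S_h its standard deviation."""
    weights = [n * s for n, s in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]


# Hypothetical site with three strata: a small, highly variable hot
# area and two larger, more homogeneous areas.
alloc = neyman_allocation(60, [100, 300, 600], [8.0, 2.0, 1.0])
```

The small but highly variable stratum receives the largest share of samples, which is exactly the cost-effectiveness argument for stratification made in the report.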
Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.
2013-04-27
This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are (1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and (2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: (1) qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; (2) qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0; (3) quantitative data (e.g., contaminant concentrations expressed as CFU/cm^2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; and (4) quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0. For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently exist only for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods that are applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports the development of such extensions. For quantitative data, this report also presents statistical methods and formulas for (1) quantifying the uncertainty in measured sample results, (2) estimating the true surface concentration corresponding to a surface sample, and (3) quantifying the uncertainty of the estimate of the true surface concentration. All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
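For simple random sampling with FNR = 0, the X%/Y% clearance statement reduces to the textbook compliance-sampling formula, confidence = 1 - Y^n after n negative samples. A sketch of that relationship (this is the standard form, not necessarily the report's exact CJR formula, which also credits judgment samples):

```python
import math


def samples_for_confidence(x_conf, y_frac):
    """Number of negative random samples needed to state with x_conf
    confidence that at least a fraction y_frac of the decision area
    contains no detectable contamination. Assumes FNR = 0 and simple
    random sampling; n = ceil(ln(1 - X) / ln(Y))."""
    return math.ceil(math.log(1.0 - x_conf) / math.log(y_frac))


def confidence_from_samples(n, y_frac):
    """Confidence achieved by n negative random samples: 1 - Y^n."""
    return 1.0 - y_frac ** n


# 95% confidence that at least 99% of the area is clean
n = samples_for_confidence(0.95, 0.99)
```

The sample count grows steeply as Y approaches 100%, which is why augmenting random samples with judgment samples (the CJR idea) is attractive in practice.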
Fossil fuel potential of Turkey: A statistical evaluation of reserves, production, and consumption
Korkmaz, S.; Kara-Gulbay, R.; Turan, M.
2008-07-01
Since Turkey is a developing country with tremendous economic growth, its energy demand is also increasing. Of this energy, about 70% is supplied from fossil fuels and the remaining 30% from renewable sources. Among the fossil fuels, 90% of the oil, natural gas, and coal is imported, and only 10% comes from domestic sources. All the lignite is supplied from domestic sources. The total share of renewable sources and lignite in total energy production is 45%. In order for Turkey to have sufficient and reliable energy sources, the renewable energy sources must first be developed, and energy production from fossil fuels, except for lignite, must be minimized. In particular, the scarcity of fossil fuels and increasing oil prices have a strong effect on the economic growth of the country.
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng
2015-02-20
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales on the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP).
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach for the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes and the influence of low-frequency variability on climate extremes might vary under a changing climate. The research specifically addresses the DOE focus area 2, simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of current regional climate models' ability to ascertain these influences, and (4) a detailed examination of how the distribution of extreme events is likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. Any and all of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.
SUPERFLARES ON SOLAR-TYPE STARS OBSERVED WITH KEPLER. I. STATISTICAL PROPERTIES OF SUPERFLARES
Shibayama, Takuya; Notsu, Shota; Notsu, Yuta; Nagao, Takashi [Department of Astronomy, Kyoto University, Sakyo, Kyoto 606-8502 (Japan); Maehara, Hiroyuki; Honda, Satoshi; Ishii, Takako T.; Nogami, Daisaku; Shibata, Kazunari, E-mail: shibayama@kwasan.kyoto-u.ac.jp [Kwasan and Hida Observatory, Kyoto University, Yamashina, Kyoto 607-8471 (Japan)
2013-11-01
By extending our previous study by Maehara et al., we searched for superflares on G-type dwarfs (solar-type stars) using Kepler data for a longer period (500 days) than that (120 days) in our previous study. As a result, we found 1547 superflares on 279 G-type dwarfs, which is much more than the previous 365 superflares on 148 stars. Using these new data, we studied the statistical properties of the occurrence rate of superflares, and confirmed the previous results, i.e., the occurrence rate (dN/dE) of superflares versus flare energy (E) shows a power-law distribution with dN/dE ∝ E{sup -α}, where α ≈ 2. It is interesting that this distribution is roughly similar to that for solar flares. In the case of the Sun-like stars (with surface temperature 5600-6000 K and slowly rotating with a period longer than 10 days), the occurrence rate of superflares with an energy of 10{sup 34}-10{sup 35} erg is once in 800-5000 yr. We also studied long-term (500 days) stellar brightness variation of these superflare stars and found that in some G-type dwarfs the occurrence rate of superflares was extremely high, ~57 superflares in 500 days (i.e., once in 10 days). In the case of Sun-like stars, the most active stars show a frequency of one superflare (with 10{sup 34} erg) in 100 days. There is evidence that these superflare stars have extremely large starspots with a size about 10 times larger than that of the largest sunspot. We argue that the physical origin of the extremely high occurrence rate of superflares in these stars may be attributed to the existence of extremely large starspots.
Akahori, Takuya; Gaensler, B. M.; Ryu, Dongsu E-mail: bryan.gaensler@sydney.edu.au
2014-08-01
Rotation measure (RM) grids of extragalactic radio sources have been widely used for studying cosmic magnetism. However, their potential for exploring the intergalactic magnetic field (IGMF) in filaments of galaxies is unclear, since other Faraday-rotation media such as the radio source itself, intervening galaxies, and the interstellar medium of our Galaxy are all significant contributors. We study statistical techniques for discriminating the Faraday rotation of filaments from other sources of Faraday rotation in future large-scale surveys of radio polarization. We consider a 30° × 30° field of view toward the south Galactic pole, while varying the number of sources detected in both present and future observations. We select sources located at high redshifts and toward which depolarization and optical absorption systems are not observed so as to reduce the RM contributions from the sources and intervening galaxies. It is found that a high-pass filter can satisfactorily reduce the RM contribution from the Galaxy since the angular scale of this component toward high Galactic latitudes would be much larger than that expected for the IGMF. Present observations do not yet provide a sufficient source density to be able to estimate the RM of filaments. However, from the proposed approach with forthcoming surveys, we predict significant residuals of RM that should be ascribable to filaments. The predicted structure of the IGMF down to scales of 0.1° should be observable with data from the Square Kilometre Array, if we achieve selections of sources toward which sightlines do not contain intervening galaxies and RM errors are less than a few rad m{sup –2}.
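The high-pass filtering step can be sketched as subtracting a local large-scale average from each cell of an RM grid, so that only structure smaller than the window survives. The boxcar window below is a generic stand-in for whatever filter a real survey analysis would use:

```python
def highpass_rm(grid, half_width):
    """Remove the large-angular-scale component of an RM grid (list of rows)
    by subtracting a (2*half_width+1)^2 boxcar average, truncated at edges."""
    ny, nx = len(grid), len(grid[0])
    out = []
    for j in range(ny):
        row = []
        for i in range(nx):
            ys = range(max(0, j - half_width), min(ny, j + half_width + 1))
            xs = range(max(0, i - half_width), min(nx, i + half_width + 1))
            vals = [grid[y][x] for y in ys for x in xs]
            row.append(grid[j][i] - sum(vals) / len(vals))
        out.append(row)
    return out
```

A spatially constant Galactic foreground is removed exactly; small-scale residuals, the candidate filament signal, pass through.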
Cirrus clouds in a global climate model with a statistical cirrus cloud scheme
Wang, Minghuai; Penner, Joyce E.
2010-06-21
A statistical cirrus cloud scheme that accounts for mesoscale temperature perturbations is implemented in a coupled aerosol and atmospheric circulation model to better represent both subgrid-scale supersaturation and cloud formation. This new scheme treats the effects of aerosol on cloud formation and ice freezing in an improved manner, and both homogeneous freezing and heterogeneous freezing are included. The scheme is able to better simulate the observed probability distribution of relative humidity compared to the scheme that was implemented in an older version of the model. Heterogeneous ice nuclei (IN) are shown to decrease the frequency of occurrence of supersaturation, and improve the comparison with observations at 192 hPa. Homogeneous freezing alone cannot reproduce observed ice crystal number concentrations at low temperatures (<205 K), but the addition of heterogeneous IN improves the comparison somewhat. Increases in heterogeneous IN affect both high level cirrus clouds and low level liquid clouds. Increases in cirrus clouds lead to a more cloudy and moist lower troposphere with less precipitation, effects which we associate with the decreased convective activity. The change in the net cloud forcing is not very sensitive to the change in ice crystal concentrations, but the change in the net radiative flux at the top of the atmosphere is still large because of changes in water vapor. Changes in the magnitude of the assumed mesoscale temperature perturbations by 25% alter the ice crystal number concentrations and the net radiative fluxes by an amount that is comparable to that from a factor of 10 change in the heterogeneous IN number concentrations. Further improvements in the representation of mesoscale temperature perturbations, heterogeneous IN, and the competition between homogeneous freezing and heterogeneous freezing are needed.
Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite
Mark C. Carroll; David T. Rohrbaugh
2013-08-01
High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that capture the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller grain, petroleum coke, extruded graphite from GrafTech was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions
Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A
2009-03-23
We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20{sup 3} x 128 with an anisotropy parameter {zeta} = b{sub s}/b{sub t} = 3.5, a spatial lattice spacing of b{sub s} = 0.1227 {+-} 0.0008 fm, and pion mass of M{sub {pi}} {approx} 390 MeV. Ground-state baryon masses are extracted with fully quantified uncertainties that are at or below the {approx} 0.2%-level in lattice units. The lowest-lying negative-parity states are also extracted albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the N{pi} threshold and, therefore, the isospin-1/2 {pi}N s-wave scattering phase-shift can be extracted using Luescher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time-slices. This is due to backward propagating states arising from the anti-periodic boundary conditions imposed on the quark-propagators in the time-direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single baryon observables.
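Ground-state masses of the kind quoted are conventionally read off the plateau of an effective mass built from the correlation function. A minimal sketch of the log-ratio estimator; the single-exponential correlator in the usage is synthetic, not the paper's data:

```python
import math

def effective_mass(corr):
    """Log-ratio effective mass: m_eff(t) = ln(C(t) / C(t+1)).
    For a single decaying exponential this plateaus at the ground-state mass;
    excited-state contamination appears as curvature at small t."""
    return [math.log(corr[t] / corr[t + 1]) for t in range(len(corr) - 1)]
```

For a pure C(t) = A exp(-m t) the estimator returns m at every time slice.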
Financial statistics of major U.S. publicly owned electric utilities 1997
1998-12-01
The 1997 edition of the "Financial Statistics of Major U.S. Publicly Owned Electric Utilities" publication presents 5 years (1993 through 1997) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator (Tables 3 through 11) and nongenerator (Tables 12 through 20) summaries are presented in this publication. Five years of summary financial data are provided (Tables 5 through 11 and 14 through 20). Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided in Appendix C. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, operating revenue, and electric energy account data. The primary source of publicly owned financial data is the Form EIA-412, "Annual Report of Public Electric Utilities." Public electric utilities file this survey on a fiscal year basis, in conformance with their recordkeeping practices. The EIA undertook a review of the Form EIA-412 submissions to determine if alternative classifications of publicly owned electric utilities would permit the inclusion of all respondents. The review indicated that financial indicators differ most according to whether or not a publicly owned electric utility generates electricity. Therefore, the main body of the report provides summary information in generator/nongenerator classifications. 2 figs., 101 tabs.
STATISTICAL STUDY OF CHROMOSPHERIC ANEMONE JETS OBSERVED WITH HINODE/SOT
Nishizuka, N.; Nakamura, T.; Kawate, T.; Singh, K. A. P.; Shibata, K.
2011-04-10
The Solar Optical Telescope on board Hinode has revealed numerous tiny jets in all regions of the chromosphere outside of sunspots. A typical chromospheric anemone jet has a cusp-shaped structure and bright footpoint, similar to the shape of an X-ray anemone jet observed previously with the Soft X-ray Telescope on board Yohkoh. The similarity in the shapes of chromospheric and X-ray anemone jets suggests that chromospheric anemone jets are produced as a result of the magnetic reconnection between a small bipole (perhaps a tiny emerging flux) and a pre-existing uniform magnetic field in the lower chromosphere. We examine various chromospheric anemone jets in the solar active region near the solar limb and study the typical features (e.g., length, width, lifetime, and velocity) of the chromospheric anemone jets. Statistical studies show that chromospheric anemone jets have: (1) a typical length {approx}1.0-4.0 Mm, (2) a width {approx}100-400 km, (3) a lifetime {approx}100-500 s, and (4) a velocity {approx}5-20 km s{sup -1}. The velocity of the chromospheric anemone jets is comparable to the local Alfven speed in the lower solar chromosphere ({approx}10 km s{sup -1}). The histograms of chromospheric anemone jets near the limb and near the disk center show similar averages and shapes of distributions, suggesting that the characteristic behavior of chromospheric anemone jets is independent of whether they are observed on the disk or at the limb. The observed relationship between the velocity and length of chromospheric anemone jets shows that the jets do not follow ballistic motion but are more likely accelerated by some other mechanism. This is consistent with numerical simulations of chromospheric anemone jets.
Jasper, Ahren
2015-04-14
The appropriateness of treating crossing seams of electronic states of different spins as nonadiabatic transition states in statistical calculations of spin-forbidden reaction rates is considered. We show that the spin-forbidden reaction coordinate, the nuclear coordinate perpendicular to the crossing seam, is coupled to the remaining nuclear degrees of freedom. We found that this coupling gives rise to multidimensional effects that are not typically included in statistical treatments of spin-forbidden kinetics. Three qualitative categories of multidimensional effects may be identified: static multidimensional effects due to the geometry-dependence of the local shape of the crossing seam and of the spin-orbit coupling, dynamical multidimensional effects due to energy exchange with the reaction coordinate during the seam crossing, and nonlocal (history-dependent) multidimensional effects due to interference of the electronic variables at second, third, and later seam crossings. Nonlocal multidimensional effects are intimately related to electronic decoherence, where electronic dephasing acts to erase the history of the system. A semiclassical model based on short-time full-dimensional trajectories that includes all three multidimensional effects as well as a model for electronic decoherence is presented. The results of this multidimensional nonadiabatic statistical theory (MNST) for the 3O + CO → CO2 reaction are compared with the results of statistical theories employing one-dimensional (Landau-Zener and weak coupling) models for the transition probability and with those calculated previously using multistate trajectories. The MNST method is shown to accurately reproduce the multistate decay-of-mixing trajectory results, so long as consistent thresholds are used. Furthermore, the MNST approach has several advantages over multistate trajectory approaches and is more suitable in chemical kinetics calculations at low temperatures and for complex systems. The error in statistical calculations that neglect multidimensional effects is shown to be as large as a factor of 2 for this system, with static multidimensional effects identified as the largest source of error.
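The one-dimensional Landau-Zener model named above has a closed-form single-passage hopping probability, sketched here in arbitrary units (hbar defaults to 1); the parameter values in the usage are purely illustrative:

```python
import math

def landau_zener_hop(h12, speed, delta_f, hbar=1.0):
    """Single-passage Landau-Zener hopping probability at a crossing seam:
    P_hop = 1 - exp(-2*pi*H12^2 / (hbar * v * |dF|)),
    where H12 is the (spin-orbit) coupling at the crossing, v the speed
    along the seam-normal coordinate, and dF the difference of the two
    diabatic surface slopes there."""
    return 1.0 - math.exp(-2.0 * math.pi * h12 ** 2 / (hbar * speed * abs(delta_f)))
```

Zero coupling gives no hop, strong coupling saturates toward unity, which is the weak-to-strong coupling interpolation the one-dimensional model provides.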
GREER DA; THIEN MG
2012-01-12
The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), has previously presented the results of mixing performance in two different sizes of small scale DSTs to support scale up estimates of full scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high level waste to the WTP. WRPS has performed small scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results, which led to the conclusion that the two scales of small DST are behaving similarly and that full scale performance is predictable, will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing from the small scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data which is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests. 
The paper will conclude with a discussion of the analysis results illustrating the relationship between the pre-transfer samples and the batch transfers, which support the recommendation regarding the need for a dedicated feed sampling facility.
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a Gaussian-mixture assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such distribution. Geometric analysis was performed to interpret such parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT. 
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems
Liou, T.S.
1999-12-01
Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subject to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces, and hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media that are characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to consider specifically the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures by the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of flow simulation showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. Seepage pattern is dominated by the fraction of asperity contacts, and their shape, size, and spatial correlation. 
However, the correlation structure of the permeability field is less important than the spatial correlation of asperity contacts. A faster breakthrough was observed in fractures subjected to higher normal stress, accompanied by a nonlinearly decreasing trend of the effective permeability. Interestingly, seepage dispersion is generally higher in fractures with an intermediate fraction of asperity contacts, but lower for small or large fractions of asperity contacts; it may become higher, however, if ponding becomes significant. Transport simulations indicate that tracers bypass dead-end pores and travel along flow paths that have less flow resistance. Accordingly, tracer breakthrough curves generally show more spreading than breakthrough curves for water. Further analyses suggest that the log-normal time model generally fails to fit the breakthrough curves for water, but it is a good approximation for breakthrough curves for the tracer.
2014 Data Book Shows Increased Use of Renewable Electricity - News Releases
December 9, 2015. The 2014 Renewable Energy Data Book shows that U.S. renewable electricity grew to 15.5 percent of total installed capacity and 13.5 percent of total electricity generation. Published annually by the National Renewable Energy Laboratory (NREL) on behalf of the Energy Department's Office of Energy Efficiency and Renewable Energy, the Data Book illustrates United States and global energy statistics, including
NREL Releases the 2013 Renewable Energy Data Book, Detailing Increases in Installed Capacity - News Releases
January 20, 2015. The newly released 2013 Renewable Energy Data Book illustrates United States and global energy statistics, including renewable electricity generation, renewable energy development, clean energy investments, and technology-specific data and trends. The Data Book is produced and published annually by the National Renewable Energy Laboratory (NREL)
Li, Ke; Tang, Jie; Chen, Guang-Hong
2014-04-15
Purpose: To reduce radiation dose in CT imaging, the statistical model based iterative reconstruction (MBIR) method has been introduced for clinical use. Based on the principle of MBIR and its nonlinear nature, the noise performance of MBIR is expected to be different from that of the well-understood filtered backprojection (FBP) reconstruction method. The purpose of this work is to experimentally assess the unique noise characteristics of MBIR using a state-of-the-art clinical CT system. Methods: Three physical phantoms, including a water cylinder and two pediatric head phantoms, were scanned in axial scanning mode using a 64-slice CT scanner (Discovery CT750 HD, GE Healthcare, Waukesha, WI) at seven different mAs levels (5, 12.5, 25, 50, 100, 200, 300). At each mAs level, each phantom was repeatedly scanned 50 times to generate an image ensemble for noise analysis. Both the FBP method with a standard kernel and the MBIR method (Veo®, GE Healthcare, Waukesha, WI) were used for CT image reconstruction. Three-dimensional (3D) noise power spectrum (NPS), two-dimensional (2D) NPS, and zero-dimensional NPS (noise variance) were assessed both globally and locally. Noise magnitude, noise spatial correlation, noise spatial uniformity, and their dose dependence were examined for the two reconstruction methods. Results: (1) At each dose level and at each frequency, the magnitude of the NPS of MBIR was smaller than that of FBP. (2) While the shape of the NPS of FBP was dose-independent, the shape of the NPS of MBIR was strongly dose-dependent; lower dose leads to a “redder” NPS with a lower mean frequency value. (3) The noise standard deviation (σ) of MBIR and dose were found to be related through a power law of σ ∝ (dose){sup -β} with the exponent β ≈ 0.25, which violates the classical σ ∝ (dose){sup -0.5} power law in FBP. (4) With MBIR, noise reduction was most prominent for thin image slices. (5) MBIR led to better noise spatial uniformity when compared with FBP. 
(6) A composite image generated from two MBIR images acquired at two different dose levels (D1 and D2) demonstrated lower noise than that of an image acquired at a dose level of D1+D2. Conclusions: The noise characteristics of the MBIR method are significantly different from those of the FBP method. The well-known trade-off relationship between CT image noise and radiation dose has been modified by MBIR to establish a more gradual dependence of noise on dose. Additionally, some other CT noise properties that had been well understood based on the linear system theory have also been altered by MBIR. Clinical CT scan protocols that had been optimized based on the classical CT noise properties need to be carefully re-evaluated for systems equipped with MBIR in order to maximize the method's potential clinical benefits in dose reduction and/or in CT image quality improvement.
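The reported departure from the classical dose law amounts to fitting the exponent of a power law on log-log axes. A minimal least-squares sketch; the mAs levels mirror the protocol above, but any sigma values fed to it here are synthetic illustrations:

```python
import math

def fit_noise_exponent(doses, sigmas):
    """Least-squares slope of log(sigma) vs log(dose), i.e. the exponent
    beta in sigma ∝ dose**beta (classical FBP behavior: beta = -0.5)."""
    xs = [math.log(d) for d in doses]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

Feeding it repeat-scan noise measurements at each mAs level would distinguish the FBP-like exponent of -0.5 from the shallower MBIR-like exponent near -0.25.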
Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.
2014-12-15
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. 
Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. Conclusions: A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ±1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
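The individuals control charts used for leaf monitoring derive their limits from the average moving range, a standard statistical-process-control construction (center ± 2.66 × mean moving range). The example offsets in the usage are invented, not measured leaf positions:

```python
def individuals_control_limits(values):
    """Control limits for an individuals (X) chart:
    center line = mean, limits = center ± 2.66 * average moving range,
    where the moving range is |x[i+1] - x[i]|."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mr) / len(mr)
    center = sum(values) / len(values)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar
```

A leaf whose measured offset drifts outside these limits is "out of control" even if it still meets the wider ±1 mm specification, which is exactly the distinction the monitoring system exploits.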
Wilson, Kevin R.; Smith, Jared D.; Kessler, Sean; Kroll, Jesse H.
2011-10-03
The heterogeneous reactions of hydroxyl radicals (OH) with squalane and bis(2-ethylhexyl) sebacate (BES) particles are used as model systems to examine how distributions of reaction products evolve during the oxidation of chemically reduced organic aerosol. A kinetic model of multigenerational chemistry, which is compared to previously measured (squalane) and new (BES) experimental data, reveals that it is the statistical mixtures of different generations of oxidation products that control the average particle mass and elemental composition during the reaction. The model suggests that more highly oxidized reaction products, although initially formed with low probability, play a large role in the production of gas phase reaction products. In general, these results highlight the importance of considering atmospheric oxidation as a statistical process, further suggesting that the underlying distribution of molecules could play important roles in aerosol formation as well as in the evolution of key physicochemical properties such as volatility and hygroscopicity.
Set statistics in conductive bridge random access memory device with Cu/HfO{sub 2}/Pt structure
Zhang, Meiyun; Long, Shibing; Wang, Guoming; Xu, Xiaoxin; Li, Yang; Liu, Qi; Lv, Hangbing; Liu, Ming; Lian, Xiaojuan; Miranda, Enrique; Suñé, Jordi
2014-11-10
The variation of switching parameters in resistive switching memory is one of the most important challenges for its application. In this letter, we study the set statistics of conductive bridge random access memory with a Cu/HfO{sub 2}/Pt structure. The experimental distributions of the set parameters in several off-resistance ranges are shown to fit a Weibull model nicely. The Weibull slopes of the set voltage and set current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is well captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides guidance for improving switching uniformity.
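The Weibull analysis of set-parameter distributions described above can be sketched generically. This is an illustrative example with synthetic set-voltage data (not the paper's measurements), assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic "set voltage" sample: Weibull with shape (slope) 4 and scale 0.6 V.
beta_true, eta_true = 4.0, 0.6
v_set = eta_true * rng.weibull(beta_true, size=2000)

# Two-parameter Weibull MLE (location fixed at 0), as in a Weibull plot.
beta_hat, _, eta_hat = stats.weibull_min.fit(v_set, floc=0)
```

In the study above, a fit of this kind would be repeated per off-resistance range, and the fitted slope examined as a function of off resistance.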
Delande, D.; Gay, J.C.
1986-10-20
The transition to chaos in "the hydrogen atom in a magnetic field" is studied numerically and shown to produce a well-defined signature in the energy-level fluctuations. As the energy increases, the calculated statistics evolve from Poisson to Gaussian orthogonal ensemble according to the regular or chaotic character of the classical motion. Several methods are employed to test the generic nature of these distributions.
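The Poisson-to-GOE transition above is diagnosed through nearest-neighbor spacing statistics. A minimal numerical sketch, using synthetic 2×2 GOE matrices (for which the Wigner surmise is exact) and assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Eigenvalue gaps of 2x2 GOE matrices [[a, c], [c, b]]; for this ensemble the
# nearest-neighbor spacing follows the Wigner surmise p(s) = (pi/2) s exp(-pi s^2/4).
n = 20000
a, b = rng.normal(size=n), rng.normal(size=n)
c = rng.normal(scale=np.sqrt(0.5), size=n)  # off-diagonal variance 1/2 for GOE
s = np.sqrt((a - b) ** 2 + 4 * c ** 2)       # eigenvalue gap
s /= s.mean()                                # unfold to unit mean spacing

var_s = s.var()                 # Wigner: 4/pi - 1 ~ 0.273; Poisson: 1
frac_small = (s < 0.1).mean()   # level repulsion: ~0.008 for Wigner vs ~0.095 for Poisson
```

For a Poisson (regular) spectrum the small spacings are not suppressed; that suppression, or its absence, is the signature tracked in the abstract.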
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D
2014-06-01
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without an immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC), and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC, and LLE measure respiratory signal complexity. The nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on the signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
Dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if nonlinear-based methodology, which reflects respiration characteristic, is applied. Funding provided by Varian Medical Systems via Investigator Initiated Research Project.
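Sample entropy, one of the complexity measures compared above, can be sketched as follows. This is a textbook-style implementation applied to synthetic signals, not the authors' code; it assumes NumPy:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): negative log of the conditional probability that templates
    matching within r for m points also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r, n = r_frac * x.std(), len(x)

    def match_pairs(mm):
        # Delay-1 embedding in mm dimensions, then Chebyshev pair distances.
        emb = np.array([x[i:n - mm + i + 1] for i in range(mm)]).T
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(emb)) / 2  # pairs i < j within tolerance r

    return -np.log(match_pairs(m + 1) / match_pairs(m))

rng = np.random.default_rng(2)
t = np.arange(600)
se_regular = sample_entropy(np.sin(2 * np.pi * t / 40))  # predictable: low SampEn
se_noise = sample_entropy(rng.normal(size=600))          # irregular: high SampEn
```

A regular, breathing-like oscillation scores much lower than white noise, which is what makes SampEn usable as a predictability gauge for respiratory tracks.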
Rosa, B.; Parishani, H.; Ayala, O.; Wang, L.-P.
2015-01-15
In this paper, we study systematically the effects of forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [“An examination of forcing in direct numerical simulations of turbulence,” Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynolds number of 30 and less. We then study the effects of forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with the regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with the region of high energy dissipation rate.
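The radial distribution function used above measures particle clustering at contact. A minimal pair-counting sketch for a periodic cubic box (illustrative only: uniformly random particles, for which g(r) ≈ 1; assumes NumPy):

```python
import numpy as np

rng = np.random.default_rng(7)

def rdf(pos, box, r_edges):
    """g(r) from positions in a periodic cubic box by direct pair counting
    (O(N^2); fine for small N only)."""
    n = len(pos)
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)                 # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, 1)]
    counts, _ = np.histogram(r, bins=r_edges)
    shell = 4.0 / 3.0 * np.pi * (r_edges[1:] ** 3 - r_edges[:-1] ** 3)
    pair_density = n * (n - 1) / 2.0 / box ** 3  # ideal-gas pairs per volume
    return counts / (shell * pair_density)

pos = rng.uniform(0.0, 10.0, size=(400, 3))      # uncorrelated particles
g = rdf(pos, 10.0, np.linspace(0.5, 3.0, 6))     # should be ~1 in every bin
```

For inertial particles in turbulence, preferential concentration shows up as g(r) rising well above 1 at small separations, which is the quantity whose forcing-time-scale dependence the paper examines.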
Edwards, Lloyd A.; Paresol, Bernard
2014-09-01
This report of the geostatistical analysis results of the fire fuels response variables, custom reaction intensity and total dead fuels is but a part of an SRS 2010 vegetation inventory project. For detailed description of project, theory and background including sample design, methods, and results please refer to USDA Forest Service Savannah River Site internal report “SRS 2010 Vegetation Inventory GeoStatistical Mapping Report”, (Edwards & Parresol 2013).
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Month-Long 2D Cloud-Resolving Model Simulation and Resultant Statistics of Cloud Systems Over the ARM SGP X. Wu Department of Geological and Atmospheric Sciences Iowa State University Ames, Iowa X.-Z. Liang Illinois State Water Survey University of Illinois Urbana-Champaign, Illinois Introduction The cloud-resolving model (CRM) has recently emerged as a useful tool to develop improved representations of convection, clouds, and cloud-radiation interactions in general circulation models (GCMs).
Hegerfeldt, G.C.; Henneberg, R. (Institut fuer Theoretische Physik, University of Goettingen, D-37073 Goettingen (Germany))
1994-05-01
The statistical analysis of energy levels, a powerful tool in the study of quantum systems, is applicable to discrete spectra. Here we propose an approach to carry level statistics over to continuous energy spectra, paradoxical as this may sound at first. The approach proceeds in three steps, first a discretization of the spectrum by cutoffs, then a statistical analysis of the resulting discrete spectra, and finally a determination of the limit distributions as the cutoffs are removed. In this way the notions of Wigner and Poisson distributions for nearest-neighbor spacing (NNS), usually associated with quantum chaos and regularity, can be carried over to systems with a purely continuous energy spectrum. The approach is demonstrated for the hydrogen atom in perpendicular electric and magnetic fields. This system has a purely continuous energy spectrum from −∞ to +∞. Depending on the field parameters, we find for the NNS a Poisson or a Wigner distribution, or a transitional behavior. We also outline how to determine physically relevant resonances in our approach by a stabilization method.
Response to several FOIA requests - Renewable Energy | Department of Energy
Reasons for Mergers and Acquisitions Among Electric Utilities (nepdg_9001_9250.pdf): statistics indicate that IOUs are becoming larger and that ownership of generation capacity among IOUs is...
Webb-Robertson, Bobbie-Jo M.; Bunn, Amoret L.; Bailey, Vanessa L.
2011-01-01
Phospholipid fatty acids (PLFA) have been widely used to characterize environmental microbial communities, generating community profiles that can distinguish phylogenetic or functional groups within the community. The poor specificity of organism groups with fatty acid biomarkers in the classic PLFA-microorganism associations is a confounding factor in many of the statistical classification/clustering approaches traditionally used to interpret PLFA profiles. In this paper we demonstrate that nonlinear statistical learning methods, such as a support vector machine (SVM), can more accurately find patterns related to uranyl nitrate exposure in a freshwater periphyton community than linear methods, such as partial least squares discriminant analysis. In addition, probabilistic models of exposure can be derived from the identified lipid biomarkers to demonstrate the potential model-based approach that could be used in remediation. The SVM probability model separates dose groups at accuracies of ~87.0%, ~71.4%, ~87.5%, and 100% for the four groups: control (non-amended system), low dose (amended at 10 µg U L{sup -1}), medium dose (amended at 100 µg U L{sup -1}), and high dose (500 µg U L{sup -1}). The SVM model achieved an overall cross-validated classification accuracy of ~87% in contrast to ~59% for the best linear classifier.
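The nonlinear classification approach described above can be sketched with a generic RBF-kernel SVM on synthetic data. The features and dose effects below are hypothetical stand-ins for PLFA profiles, not the study's data; the sketch assumes scikit-learn:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Hypothetical stand-in for PLFA profiles: 4 dose groups x 30 samples, 10 features.
X = rng.normal(size=(120, 10))
y = np.repeat([0, 1, 2, 3], 30)
X[:, 0] += 0.8 * y        # linear dose effect
X[:, 1] += 0.4 * y ** 2   # nonlinear dose effect a linear classifier underuses

clf = SVC(kernel="rbf", probability=True, random_state=0)  # probabilistic outputs
acc = cross_val_score(clf, X, y, cv=5).mean()              # cross-validated accuracy
```

`probability=True` is what yields the per-class exposure probabilities of the kind the abstract reports; cross-validation mirrors its overall accuracy estimate.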
Visual Sample Plan (VSP) Statistical Software as Related to the CTBTO’s On-Site Inspection Procedure
Pulsipher, Trenton C.; Walsh, Stephen J.; Pulsipher, Brent A.; Milbrath, Brian D.
2010-09-01
In the event of a potential nuclear weapons test, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is commissioned to conduct an on-site inspection (OSI) of the suspected test site in an effort to find confirmatory evidence of the nuclear test. The OSI activities include collecting air, surface soil, and underground samples to search for indications of a nuclear weapons test; these indicators include radionuclides, notably radioactive isotopes of Ar and Xe. This report investigates the capability of the Visual Sample Plan (VSP) software to contribute to the sampling activities of the CTBTO during an OSI. VSP is statistical sampling design software, constructed under data quality objectives, which has been adapted for environmental remediation and contamination detection problems for the EPA, US Army, DoD, and DHS, among others. This report provides discussion of a number of VSP sample designs that may be pertinent to the work undertaken during an OSI. Examples and descriptions of such designs include hot spot sampling, combined random and judgment sampling, multiple increment sampling, radiological transect surveying, and a brief description of other potentially applicable sampling methods. Further, this work highlights a potential need for statistically based sample designs in OSI activities. The use of such designs may enable canvassing a sample area without full sampling, provide a measure of confidence that radionuclides are not present, and allow investigators to refocus resources in other areas of concern.
Pekney, Natalie J.; Cheng, Hanqi; Small, Mitchell J.
2015-11-05
Abstract: The objective of the current work was to develop a statistical method and associated tool to evaluate the impact of oil and natural gas exploration and production activities on local air quality.
Carroll, Robert; Lee, Chi; Tsai, Che-Wei; Yeh, Jien-Wei; Antonaglia, James; Brinkman, Braden A.W.; LeBlanc, Michael; Xie, Xie; Chen, Shuying; Liaw, Peter K; Dahmen, Karin A
2015-11-23
High-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly equal proportions. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining), and observe differences compared to conventional alloys with fewer elements. For a specific range of temperatures and strain rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain rate, and material composition. The ratio of the weak spots' healing rate to the strain rate is the main tuning parameter, reminiscent of the Portevin–Le Chatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely applicable deformation mechanism is useful for deformation control and alloy design.
Foissac, R.; Blonkowski, S.; Delcroix, P.; Kogelschatz, M.
2014-07-14
Using an ultra-high vacuum Conductive atomic force microscopy (C-AFM) current voltage, pre-breakdown negative differential resistance (NDR) characteristics are measured together with the time dependent dielectric breakdown (TDDB) distributions of Si/SiON (1.4 and 2.6?nm thick). Those experimental characteristics are systematically compared. The NDR effect is modelled by a conductive filament growth. It is showed that the Weibull TDDB statistic distribution scale factor is proportional to the growth rate of an individual filament and then has the same dependence on the electric field. The proportionality factor is a power law of the ratio between the surfaces of the CAFM tip and the filament's top. Moreover, it was found that, for the high fields used in those experiments, the TDDB acceleration factor as the growth rate characteristic is proportional to the Zener tunnelling probability. Those observations are discussed in the framework of possible breakdown or forming mechanism.
Table B1. Summary statistics for natural gas in the United States, metric equivalents, 2010-2014
U.S. Energy Information Administration (EIA) Indexed Site
8 Table B1. Summary statistics for natural gas in the United States, metric equivalents, 2010-2014 See footnotes at end of table. Number of Wells Producing at End of Year 487,627 514,637 482,822 R 484,994 514,786 Production (million cubic meters) Gross Withdrawals From Gas Wells 375,127 348,044 354,080 R 304,676 294,045 From Oil Wells 165,220 167,294 140,617 R 153,044 167,695 From Coalbed Wells 54,277 50,377 43,591 R 40,374 36,392 From Shale Gas Wells 164,723 240,721 298,257 R 337,891 389,474
Burr, Tom; Hamada, Michael S.; Ticknor, Larry; Sprinkle, James
2015-01-01
The aim of nuclear safeguards is to ensure that special nuclear material is used for peaceful purposes. Historically, nuclear material accounting (NMA) has provided the quantitative basis for monitoring for nuclear material loss or diversion, and process monitoring (PM) data is collected by the operator to monitor the process. PM data typically support NMA in various ways, often by providing a basis to estimate some of the in-process nuclear material inventory. We develop options for combining PM residuals and NMA residuals (residual = measurement - prediction), using a hybrid of period-driven and data-driven hypothesis testing. The modified statistical tests can be used on time series of NMA residuals (the NMA residual is the familiar material balance), or on a combination of PM and NMA residuals. The PM residuals can be generated on a fixed time schedule or as events occur.
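One standard way to test a time series of material-balance residuals, in the spirit of the sequential tests above, is Page's CUSUM. This is a generic sketch on synthetic residuals, not the authors' modified tests; it assumes NumPy:

```python
import numpy as np

def cusum_alarms(residuals, k=0.5, h=5.0):
    """One-sided Page CUSUM on standardized residuals (sigma units).
    k is the allowance, h the decision threshold; the statistic resets
    to zero after each alarm."""
    s, alarms = 0.0, []
    for i, r in enumerate(residuals):
        s = max(0.0, s + r - k)
        if s > h:
            alarms.append(i)
            s = 0.0
    return alarms

rng = np.random.default_rng(4)
clean = rng.normal(size=200)                                # in-control balances
shifted = np.concatenate([clean[:100], clean[100:] + 1.5])  # loss begins at t = 100
alarms_clean = cusum_alarms(clean)
alarms_shifted = cusum_alarms(shifted)
```

A sustained positive shift in the material balance, the canonical loss/diversion signature, triggers repeated alarms shortly after onset, while the in-control series stays mostly quiet.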
Miniati, Francesco
2015-02-10
We use the Matryoshka run to study the time-dependent statistics of structure-formation-driven turbulence in the intracluster medium of a 10{sup 15} M{sub ⊙} galaxy cluster. We investigate the turbulent cascade in the inner megaparsec for both compressional and incompressible velocity components. The flow maintains approximate conditions of fully developed turbulence, with departures thereof settling in about an eddy-turnover time. Turbulent velocity dispersion remains above 700 km s{sup –1} even at low mass accretion rate, with the fraction of compressional energy between 10% and 40%. The normalization and the slope of the compressional turbulence are susceptible to large variations on short timescales, unlike the incompressible counterpart. A major merger occurs around redshift z ≈ 0 and is accompanied by a long period of enhanced turbulence, ascribed to temporal clustering of mass accretion related to spatial clustering of matter. We test models of stochastic acceleration by compressional modes for the origin of diffuse radio emission in galaxy clusters. The turbulence simulation model constrains an important unknown of this complex problem and brings forth its dependence on the elusive microphysics of the intracluster plasma. In particular, the specifics of the plasma collisionality and the dissipation physics of weak shocks affect the cascade of compressional modes with strong impact on the acceleration rates. In this context radio halos emerge as complex phenomena in which a hierarchy of processes acting on progressively smaller scales are at work. Stochastic acceleration by compressional modes implies statistical correlation of radio power and spectral index with merging cores distance, both testable in principle with radio surveys.
Brandt, Timothy D.; Spiegel, David S.; McElwain, Michael W.; Grady, C. A.; Turner, Edwin L.; Mede, Kyle; Kuzuhara, Masayuki; Schlieder, Joshua E.; Brandner, W.; Feldt, M.; Wisniewski, John P.; Abe, L.; Biller, B.; Carson, J.; Currie, T.; Egner, S.; Golota, T.; Guyon, O.; Goto, M.; Hashimoto, J.; and others
2014-10-20
We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ~60 M{sub J} brown dwarf companions in the Pleiades, PZ Tel B, and CD–35 2722B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ~30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ~5 M{sub J}, with a single power-law distribution. We find that p(M, a) ∝ M{sup –0.65±0.60} a{sup –0.85±0.39} (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M{sub J} companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.
Statistical Significance of Data
U.S. Energy Information Administration (EIA) Indexed Site
are based on data collected from a randomly chosen subset of the entire commercial building population. One source of the difference between the estimated values and the actual...
International petroleum statistics report
1996-08-01
This report provides information on current international petroleum production, demand, imports, and stocks. World oil demand and OECD demand data are presented for the years 1970 thru 1995.
International petroleum statistics report
1996-12-01
This report presents data on international oil production, demand, imports, and stocks. World oil production and OECD demand data are for the years 1970 through 1995; stocks from 1973 through 1995, and trade from 1985 through 1995.
International petroleum statistics report
1996-03-01
This report presents data on international oil production, demand, imports, exports, and stocks. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
Annual Energy Outlook [U.S. Energy Information Administration (EIA)]
Released for Printing: July 28, 2003 Monthly Energy Review The Monthly Energy Review (MER) presents an overview of the Energy Information...
International petroleum statistics report
1998-01-01
This monthly publication provides international oil data for January 1998. The report presents data on oil production, demand, imports, and stocks in four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, given in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 contains annual time series data on world oil production and on oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
Not Available
1994-06-01
This report presents data on international oil production, demand, imports, exports, and stocks. Section 1 contains time series data on world oil production, and on oil demand and stocks in the OECD. Section 2 presents an oil supply/demand balance for the world, presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production, oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
Not Available
1994-05-01
This monthly publication provides current international oil data. The Report presents data on international oil production, demand, imports, exports, and stocks. Section 1 contains time series data on world oil production, and on oil demand and stocks in the OECD. Section 2 presents an oil supply/demand balance for the world. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
1997-11-01
This document is a monthly publication which provides current data on international oil production, demand, imports, and stocks. The report has four sections containing time series data on world oil production and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Also included are an oil supply/demand balance for the world and data on oil imports and trade by OECD countries.
International petroleum statistics report
Not Available
1995-01-01
This monthly publication provides current data on international oil production, demand, imports, exports, and stocks. Section 1 contains time series data on world oil production, and on oil demand and stocks in the OECD. Section 2 presents an oil supply/demand balance for the world. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
Not Available
1994-12-01
This document is a monthly publication that provides current international oil data. The Report presents data on international oil production, demand, imports, exports, and stocks. Section 1 contains time series data on world oil production, and on oil demand and stocks in the OECD. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand and trade in OECD countries.
AMERICAN STATISTICAL ASSOCIATION (ASA)
U.S. Energy Information Administration (EIA) Indexed Site
... So if you have a recommendation for target population it'd be presented in the target ... Do we do a good job covering petroleum marketing area, the prices and things that we're ...
LM Stakeholder Interaction and External Communications June 2014 Page 1 OVERVIEW The U.S. Department of Energy (DOE) Office of Legacy Management (LM) makes every effort to communicate with its stakeholders through public and small group meetings, conferences, briefings, news releases, telephone, e-mail, informational materials, and through the LM website. To assess the effectiveness of LM's communication with stakeholders across the nation, an analysis of stakeholder interaction was performed.
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
... What we're really looking for are procedures methods that we can do that as low cost as possible, identifying where our problems are, at minimal cost is the real issue to us. MR. ...
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
... you've already found the 85 or 90 percent. But then people used a minimal, a really minimal geological information and say, well, it's really not log normal, we ...
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
... gas because of North American natural gas being in this ... Is it, you know, the educated lay public or industry ... agencies where you're looking at health information. Right? ...
Li, K; Zhao, W; Gomez-Cardona, D; Chen, G
2014-06-15
Purpose: Automatic tube current modulation (TCM) has been widely used in modern multi-detector CT to reduce noise spatial nonuniformity and streaks to improve dose efficiency. With the advent of statistical iterative reconstruction (SIR), it is expected that the importance of TCM may diminish, since SIR incorporates statistical weighting factors to reduce the negative influence of photon-starved rays. The purpose of this work is to address the following questions: Does SIR offer the same benefits as TCM? If yes, are there still any clinical benefits to using TCM? Methods: An anthropomorphic CIRS chest phantom was scanned using a state-of-the-art clinical CT system equipped with an SIR engine (Veo™, GE Healthcare). The phantom was first scanned with TCM using a routine protocol and a low-dose (LD) protocol. It was then scanned without TCM using the same protocols. For each acquisition, both FBP and Veo reconstructions were performed. All scans were repeated 50 times to generate an image ensemble from which noise spatial nonuniformity (NSN) and streak artifact levels were quantified. Monte-Carlo experiments were performed to estimate skin dose. Results: For FBP, noise streaks were reduced by 4% using TCM for both routine and LD scans. NSN values were actually slightly higher with TCM (0.25) than without TCM (0.24) for both routine and LD scans. In contrast, for Veo, noise streaks became negligible (<1%) with or without TCM for both routine and LD scans, and the NSN was reduced to 0.10 (low dose) or 0.08 (routine). The overall skin dose was 2% lower at the shoulders and more uniformly distributed across the skin without TCM. Conclusion: SIR without TCM offers superior reduction in noise nonuniformity and streaks relative to FBP with TCM. For some clinical applications in which skin dose may be a concern, SIR without TCM may be a better option. K. Li, W. Zhao, D. Gomez-Cardona: Nothing to disclose; G.-H. 
Chen: Research funded, General Electric Company; research funded, Siemens AG; research funded, Varian Medical Systems; research funded, Hologic, Inc.
Full counting statistics as a probe of quantum coherence in a side-coupled double quantum dot system
Xue, Hai-Bin
2013-12-15
We study theoretically the full counting statistics of electron transport through a side-coupled double quantum dot (QD) based on an efficient particle-number-resolved master equation. It is demonstrated that the high-order cumulants of the transport current are more sensitive to quantum coherence than the average current, and can thus be used to probe the quantum coherence of the considered double-QD system. In particular, quantum coherence plays a crucial role in determining whether super-Poissonian noise occurs in the weak inter-dot hopping regime, depending on the corresponding QD-lead coupling, and the corresponding values of super-Poissonian noise can be relatively enhanced when the spins of the conduction electrons are considered. Moreover, this super-Poissonian noise bias range depends on the singly occupied eigenstates of the system, which thus suggests a tunable super-Poissonian noise device. The occurrence mechanism of the super-Poissonian noise can be understood in terms of the interplay of quantum coherence and the effective competition between fast and slow transport channels.
Highlights:
• The FCS can be used to probe the quantum coherence of a side-coupled double QD system.
• Probing quantum coherence using FCS may permit experimental tests in the near future.
• The current noise characteristics depend on the quantum coherence of this QD system.
• The super-Poissonian noise can be enhanced when considering conduction electron spin.
• The side-coupled double QD system suggests a tunable super-Poissonian noise device.
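Super-Poissonian noise, as discussed above, is commonly quantified by the Fano factor, the ratio of the second to the first cumulant of the counting distribution. A toy numerical sketch (the bunching model below is a generic compound-Poisson stand-in, not the paper's master-equation calculation; assumes NumPy):

```python
import numpy as np

rng = np.random.default_rng(5)

def fano(counts):
    """Fano factor F = var(n)/mean(n); F > 1 indicates super-Poissonian noise."""
    counts = np.asarray(counts)
    return counts.var() / counts.mean()

# Poissonian transport: independent tunneling events in a fixed time window.
poisson_n = rng.poisson(20.0, size=20000)

# Bunched transport: a geometric number of electrons per Poisson "burst",
# a crude stand-in for fast/slow channel competition.
bursts = rng.poisson(5.0, size=20000)
bunched_n = np.array([rng.geometric(0.25, size=k).sum() for k in bursts])
```

Electron bunching from competing fast and slow channels inflates the variance relative to the mean, which is exactly how the super-Poissonian regime shows up in the counting statistics.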
Mishra, Srikanta; Schuetter, Jared
2014-11-01
We compare two approaches for building a statistical proxy model (metamodel) for CO₂ geologic sequestration from the results of full-physics compositional simulations. The first approach involves a classical Box-Behnken or Augmented Pairs experimental design with a quadratic polynomial response surface. The second approach uses a space-filling maximin Latin Hypercube sampling or maximum entropy design with a choice of five different metamodeling techniques: quadratic polynomial, kriging with constant and quadratic trend terms, multivariate adaptive regression splines (MARS), and additivity and variance stabilization (AVAS). Simulation results for CO₂ injection into a reservoir-caprock system with 9 design variables (and 97 samples) were used to generate the data for developing the proxy models. The fitted models were validated using an independent data set and a cross-validation approach for three different performance metrics: total storage efficiency, CO₂ plume radius, and average reservoir pressure. The Box-Behnken–quadratic polynomial metamodel performed the best, followed closely by the maximin LHS–kriging metamodel.
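The second (space-filling) workflow can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's actual setup: the `simulator` stand-in, the two design variables, and the 40 samples are hypothetical, and the response surface is the quadratic polynomial option fit by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_dims, rng):
    # One stratified point per interval, independently permuted per dimension.
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

# Toy "full-physics simulator": an exactly quadratic response with an
# interaction term, standing in for a metric such as storage efficiency.
def simulator(x):
    return 1.0 + 2.0 * x[:, 0] - x[:, 1] + 0.5 * x[:, 0] * x[:, 1] + x[:, 1] ** 2

X = latin_hypercube(40, 2, rng)
y = simulator(X)

# Quadratic polynomial response surface fit by least squares.
def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Validate on an independent data set, mirroring the paper's workflow.
X_test = rng.random((200, 2))
rmse = np.sqrt(np.mean((quad_features(X_test) @ coef - simulator(X_test)) ** 2))
print(rmse < 1e-6)
```

Because the toy response is itself quadratic, the fit recovers it essentially exactly; with a real compositional simulator the held-out RMSE is the quantity the cross-validation step assesses.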
Klenzing, J. H.; Earle, G. D.; Heelis, R. A.; Coley, W. R. [William B. Hanson Center for Space Sciences, University of Texas at Dallas, 800 W. Campbell Rd. WT15, Richardson, Texas 75080 (United States)
2009-05-15
The use of biased grids as energy filters for charged particles is common in satellite-borne instruments such as a planar retarding potential analyzer (RPA). Planar RPAs are currently flown on missions such as the Communications/Navigation Outage Forecast System and the Defense Meteorological Satellites Program to obtain estimates of geophysical parameters including ion velocity and temperature. It has been shown previously that the use of biased grids in such instruments creates a nonuniform potential in the grid plane, which leads to inherent errors in the inferred parameters. A simulation of ion interactions with various configurations of biased grids has been developed using a commercial finite-element analysis software package. Using a statistical approach, the simulation calculates collected flux from Maxwellian ion distributions with three-dimensional drift relative to the instrument. Perturbations in the performance of flight instrumentation relative to expectations from the idealized RPA flux equation are discussed. Both single grid and dual-grid systems are modeled to investigate design considerations. Relative errors in the inferred parameters for each geometry are characterized as functions of ion temperature and drift velocity.
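The statistical flux calculation can be illustrated for the idealized uniform-potential grid that the abstract compares against. The sketch below is an assumption-laden toy, not the finite-element simulation described: it works in one velocity dimension with hypothetical drift, spread, and cutoff values, and checks a Monte Carlo flux estimate against the analytic transmitted-flux integral for a drifting Maxwellian.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Idealized planar RPA: a 1-D drifting Maxwellian hits a retarding grid that
# transmits only ions whose normal velocity exceeds v_c (set by the grid bias).
# All values below are hypothetical, in thermal-speed units.
v_d, s, v_c = 1.0, 1.0, 1.5   # drift, thermal spread, cutoff velocity

# Analytic transmitted flux for the idealized (uniform-potential) grid:
# integral of v * Gaussian(v; v_d, s) over v > v_c.
analytic = (s ** 2 * math.exp(-((v_c - v_d) ** 2) / (2 * s ** 2))
            + v_d * s * math.sqrt(math.pi / 2)
            * math.erfc((v_c - v_d) / (s * math.sqrt(2))))
analytic /= s * math.sqrt(2 * math.pi)   # normalize the Maxwellian

# Statistical (Monte Carlo) estimate of the same collected flux.
v = rng.normal(v_d, s, 400_000)
mc = np.mean(np.where(v > v_c, v, 0.0))

print(abs(mc - analytic) / analytic < 0.02)
```

The paper's point is that a nonuniform grid-plane potential makes the true instrument response deviate from this ideal integral, biasing the inferred temperature and drift.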
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Carroll, Robert; Lee, Chi; Tsai, Che-Wei; Yeh, Jien-Wei; Antonaglia, James; Brinkman, Braden A.W.; LeBlanc, Michael; Xie, Xie; Chen, Shuying; Liaw, Peter K; et al
2015-11-23
High-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly equal proportions. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining) and observe differences compared to conventional alloys with fewer elements. For a specific range of temperatures and strain rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain rate, and material composition. The ratio of the weak spots' healing rate to the strain rate is the main tuning parameter, reminiscent of the Portevin-Le Chatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely applicable deformation mechanism is useful for deformation control and alloy design.
Inada, Naohisa; Oguri, Masamune; Shin, Min-Su; Kayo, Issha; Fukugita, Masataka; Strauss, Michael A.; Gott, J. Richard; Hennawi, Joseph F.; Morokuma, Tomoki; Becker, Robert H.; Gregg, Michael D.; White, Richard L.; Kochanek, Christopher S.; Chiu, Kuenley; Johnston, David E.; Clocchiatti, Alejandro; Richards, Gordon T.; Schneider, Donald P.; Frieman, Joshua A.
2010-08-15
We present the second report of our systematic search for strongly lensed quasars in the data of the Sloan Digital Sky Survey (SDSS). From extensive follow-up observations of 136 candidate objects, we find 36 lenses in the full sample of 77,429 spectroscopically confirmed quasars in the SDSS Data Release 5. We then define a complete sample of 19 lenses, including 11 from our previous search in the SDSS Data Release 3, from the sample of 36,287 quasars with i < 19.1 in the redshift range 0.6 < z < 2.2, where we require the lenses to have image separations of 1″ < θ < 20″ and i-band magnitude differences between the two images smaller than 1.25 mag. Among the 19 lensed quasars, three have quadruple-image configurations, while the remaining 16 show double images. This lens sample constrains the cosmological constant to be Ω_Λ = 0.84^{+0.06}_{-0.08} (stat.)^{+0.09}_{-0.07} (syst.) assuming a flat universe, which is in good agreement with other cosmological observations. We also report the discoveries of seven binary quasars with separations ranging from 1.″1 to 16.″6, which were identified in the course of our lens survey. This study concludes the construction of our statistical lens sample in the full SDSS-I data set.
Rodríguez, Yeinzon; Almeida, Juan P. Beltrán; Valenzuela-Toledo, César A. E-mail: juanpbeltran@uan.edu.co
2013-04-01
We present the different consistency relations that can be seen as variations of the well-known Suyama-Yamaguchi (SY) consistency relation τ_NL ≥ ((6/5) f_NL)², the latter involving the levels of non-Gaussianity f_NL and τ_NL in the primordial curvature perturbation ζ. It has been (implicitly) claimed that the following variation: τ_NL(k₁, k₃) ≥ ((6/5))² f_NL(k₁) f_NL(k₃), which we call ''the fourth variety'', in the collapsed (for τ_NL) and squeezed (for f_NL) limits is always satisfied independently of any physics; however, the proof depends sensitively on the assumption of scale invariance (expressing in this way the fourth variety of the SY consistency relation as τ_NL ≥ ((6/5) f_NL)²), which only applies for cosmological models involving Lorentz-invariant scalar fields (at least at tree level), leaving room for a strong violation of this variety of the consistency relation when non-trivial degrees of freedom, for instance vector fields, are in charge of the generation of the primordial curvature perturbation. With this motivation in mind, we explicitly state, in the first part of this work, under which conditions the SY consistency relation has been claimed to hold in its different varieties (implicitly) presented in the literature since its inception back in 2008; as a result, we show for the first time that the variety τ_NL(k₁, k₁) ≥ ((6/5) f_NL(k₁))², which we call ''the fifth variety'', is always satisfied even when there is strong scale dependence and high levels of statistical anisotropy, as long as statistical homogeneity holds: thus, an observed violation of this specific variety would prevent the comparison between theory and observation, shaking in this way the foundations of cosmology as a science.
In the second part, we address the existence of non-trivial degrees of freedom, concretely vector fields, for which the levels of non-Gaussianity have been calculated for very few models; among them, and by making use of the δN formalism at tree level, we study a class of models that includes the vector curvaton scenario, vector inflation, and hybrid inflation with coupled vector and scalar ''waterfall field'' where ζ is generated at the end of inflation, finding that the fourth variety of the SY consistency relation is indeed strongly violated for some specific wavevector configurations while the fifth variety continues to be well satisfied. Finally, as a byproduct of our investigation, we draw attention to a quite recently demonstrated variety of the SY consistency relation: τ^iso_NL ≥ ((6/5) f^iso_NL)², in scenarios where scalar and vector fields contribute to the generation of the primordial curvature perturbation; this variety of the SY consistency relation is satisfied although the isotropic pieces of the non-Gaussianity parameters receive contributions from the vector fields. We discuss further implications for observational cosmology.
Intrinsic alignments of galaxies in the MassiveBlack-II simulation: Analysis of two-point statistics
Tenneti, Ananth; Singh, Sukhdeep; Mandelbaum, Rachel; Matteo, Tiziana Di; Feng, Yu; Khandai, Nishikanta
2015-03-11
The intrinsic alignment of galaxies with the large-scale density field is an important astrophysical contaminant in upcoming weak lensing surveys. We present detailed measurements of the galaxy intrinsic alignments and associated ellipticity-direction (ED) and projected shape (w_g+) correlation functions for galaxies in the cosmological hydrodynamic MassiveBlack-II (MB-II) simulation. We carefully assess the effects on galaxy shapes, misalignment of the stellar component with the dark matter shape, and two-point statistics of iteratively weighted (by mass and luminosity) definitions of the (reduced and unreduced) inertia tensor. We find that iterative procedures must be adopted for a reliable measurement of the reduced tensor but that luminosity versus mass weighting has only negligible effects. Both ED and w_g+ correlations increase in amplitude with subhalo mass (in the range 10¹⁰ to 6.0 × 10¹⁴ h⁻¹ M_⊙), with a weak redshift dependence (from z = 1 to z = 0.06) at fixed mass. At z ~ 0.3, we predict a w_g+ that is in reasonable agreement with SDSS LRG measurements and that decreases in amplitude by a factor of ~5-18 for galaxies in the LSST survey. We also compare the intrinsic alignment of centrals and satellites, with clear detection of satellite radial alignments within the host halos. Finally, we show that w_g+ (using subhalos as tracers of density) and w_δ (using dark matter density) predictions from the simulations agree with those of nonlinear alignment (NLA) models at scales where the 2-halo term dominates the correlations (and tabulate associated NLA fitting parameters). The 1-halo term induces a scale-dependent bias at small scales which is not modeled in the NLA model.
Powell, M.A.; Rawlinson, K.S.
1992-01-01
A kinetic Stirling cycle engine, the Stirling Thermal Motors (STM) STM4-120, was tested at the Sandia National Laboratories Engine Test Facility (ETF) from March 1989--August 1992. Sandia is interested in determining this engine's potential for solar-thermal-electric applications. The last round of testing was conducted from July--August 1992 using Sandia-designed gas-fired heat pipe evaporators as the heat input system to the engine. The STM4-120 was performance mapped over a range of sodium vapor temperatures, cooling water temperatures, and cycle pressures. The resulting shaft power output levels ranged from 5--9 kW. The engine demonstrated high conversion efficiency (24--31%) even though the power output level was less than 40% of the rated output of 25 kW. The engine had been previously derated from 25 kW to 10 kW shaft power due to mechanical limitations that were identified by STM during parallel testing at their facility in Ann Arbor, MI. A statistical method was used to design the experiment, to choose the experimental points, and to generate correlation equations describing the engine performance given the operating parameters. The testing was truncated due to a failure of the heat pipe system caused by entrainment of liquid sodium in the condenser section of the heat pipes. Enough data was gathered to generate the correlations and to demonstrate the experimental technique. The correlation is accurate in the experimental space and is simple enough for use in hand calculations and spreadsheet-based system models. Use of this method can simplify the construction of accurate performance and economic models of systems in which the engine is a component. The purpose of this paper is to present the method used to design the experiments and to analyze the performance data.
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
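The levelized-price calculation underlying the comparison can be sketched directly. The figures below are hypothetical (a 100 MW plant, an assumed $48.3M/yr payment stream, and a 10% discount rate), chosen only to illustrate the discounted-cost-over-discounted-energy arithmetic, not taken from the paper's 26-project sample.

```python
def levelized_price(costs, energies, rate):
    """Levelized price: present value of costs divided by present value of energy."""
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs, start=1))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energies, start=1))
    return pv_cost / pv_energy

# Hypothetical 100 MW project at an 80% capacity factor over 20 years.
annual_kwh = 100_000 * 0.8 * 8760        # kW * capacity factor * hours/yr
price = levelized_price([48.3e6] * 20, [annual_kwh] * 20, rate=0.10)
print(round(price, 3))                   # $/kWh
```

With constant cost and energy streams the discount factors cancel, so the levelized price reduces to annual cost over annual energy; escalating fuel costs or varying dispatch are what make the discounting matter in the paper's sample.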
Wang, Minghuai; Liu, Xiaohong; Zhang, Kai; Comstock, Jennifer M.
2014-09-01
A statistical cirrus cloud scheme that tracks the ice saturation ratio in the clear-sky and cloudy portions of a grid box separately has been implemented into NCAR CAM5 to provide a consistent treatment of ice nucleation and cloud formation. Simulated ice supersaturation and ice crystal number concentrations strongly depend on the number concentrations of heterogeneous ice nuclei (IN), subgrid temperature formulas, and the number concentration of sulfate particles participating in homogeneous freezing, while simulated ice water content is insensitive to these perturbations. Allowing 1% to 10% of dust particles to serve as heterogeneous IN is found to produce ice supersaturation in better agreement with observations. Introducing a subgrid temperature perturbation based on long-term aircraft observations of mesoscale motion produces a better hemispheric contrast in ice supersaturation compared to observations. Heterogeneous IN from dust particles significantly alter the net radiative fluxes at the top of atmosphere (TOA) (-0.24 to -1.59 W m⁻²) with a significant clear-sky longwave component (0.01 to -0.55 W m⁻²). Different cirrus treatments significantly perturb the net TOA anthropogenic aerosol forcing from -1.21 W m⁻² to -1.54 W m⁻², with a standard deviation of 0.10 W m⁻². Aerosol effects on cirrus clouds exert an even larger impact on the atmospheric component of the radiative fluxes (two or three times the changes in the TOA radiative fluxes) and therefore on the hydrological cycle through the fast atmosphere response. This points to the urgent need to quantify aerosol effects on cirrus clouds through ice nucleation and how these further affect the hydrological cycle.
MIAO Rong-Zhi; WU Guo-Hua; ZENG Wei-Han; LIU Jian-Ye; YU Chao-Fan; YU Xian
1985-10-01
It is assumed that the initial exciton number n₀ is statistical. The expression of the probability h(n₀) for any probable n₀ is given. The theoretical calculation results, including the energy spectra and the double differential cross sections, are obtained by weighted summation of the contributions coming from the various probable n₀. The agreement between the experimental data and the theoretical results is quite good.
Hoon Sohn; Charles Farrar; Norman Hunter; Keith Worden
2001-01-01
This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. 
A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
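The damage-sensitive-feature step can be sketched with a simplified AR-only stand-in (the report's full procedure also uses ARX models and the data-normalization scheme described above). The AR order, the AR(2) toy signals, and the noise levels below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar_residual_feature(signal, order=4):
    """Fit an AR(order) model by least squares and return the residual std,
    used here as a damage-sensitive feature (a simplified stand-in for the
    AR/ARX procedure described in the report)."""
    n = len(signal)
    # Lag matrix: column k holds the (k+1)-step-delayed signal.
    X = np.column_stack([signal[order - k - 1:n - k - 1] for k in range(order)])
    y = signal[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    return residuals.std(ddof=order)

# Hypothetical "undamaged" signal: an AR(2) process; "damaged": the same
# process with extra broadband noise, which inflates the AR residual error.
def ar2(n, extra_noise, rng):
    x = np.zeros(n)
    e = rng.normal(0.0, 1.0 + extra_noise, n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
    return x

baseline = ar_residual_feature(ar2(5000, 0.0, rng))
damaged = ar_residual_feature(ar2(5000, 0.8, rng))
print(damaged > baseline)
```

In practice the discrimination threshold on the feature is set from the baseline ensemble, which is what the report's false-positive study exercises.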
Quantum of area ΔA = 8πl_P² and a statistical interpretation of black hole entropy
Ropotenko, Kostiantyn
2010-08-15
In contrast to alternative values, the quantum of area ΔA = 8πl_P² does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity, a microcanonical distribution for a black hole is found, and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P² over the horizon. Taking these results into account, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/(8πl_P²). The nature of the degrees of freedom responsible for black hole entropy is elucidated. Applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.
Koner, Debasish; Panda, Aditya N.; Barrios, Lizandra; González-Lezana, Tomás
2014-09-21
A real wave packet based time-dependent method and a statistical quantum method have been used to study the He + NeH⁺ (v, j) reaction, with the reactant in various ro-vibrational states, on a recently calculated ab initio ground-state potential energy surface. Both the wave packet and statistical quantum calculations were carried out within the centrifugal sudden approximation as well as using the exact Hamiltonian. Quantum reaction probabilities exhibit a dense oscillatory pattern for smaller total angular momentum values, which is a signature of resonances in a complex-forming mechanism for the title reaction. Significant differences found between the exact and approximate quantum reaction cross sections highlight the importance of including Coriolis coupling in the calculations. Statistical results are in fairly good agreement with the exact quantum results for ground ro-vibrational states of the reactant. Vibrational excitation greatly enhances the reaction cross sections, whereas rotational excitation has a relatively small effect on the reaction. The shape of the reaction cross section curves depends on the initial vibrational state of the reactant and is typical of a late-barrier potential energy profile.
Ladd-Lively, Jennifer L
2014-01-01
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
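A minimal sketch of the PCA monitoring idea, assuming synthetic data: the base-case model is built by singular value decomposition of scaled normal-operation data, and a squared-prediction-error (Q) statistic flags samples inconsistent with it. The six-variable toy data and the single-stream "diversion" shift are hypothetical, not the NUCP mole-balance model from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical base-case data: 500 samples of 6 correlated process
# measurements (stand-ins for unit mole balances), driven by 2 latent
# operating factors plus small measurement noise.
latent = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 6))
X = latent @ mixing + 0.05 * rng.normal(size=(500, 6))

# PCA base-case model via SVD of the mean-centered, scaled data matrix.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:2].T                      # loadings of the 2 retained components

def spe_statistic(x):
    """Squared prediction error (Q statistic) of a sample vs. the PCA model;
    large values flag operation inconsistent with the base case."""
    z = (x - mu) / sd
    r = z - (z @ P) @ P.T         # residual outside the principal subspace
    return float(r @ r)

# A simulated "diversion" fault: one measured stream shifted off its balance.
normal_sample = X[0]
faulty_sample = X[0].copy()
faulty_sample[2] += 3.0 * sd[2]
print(spe_statistic(faulty_sample) > spe_statistic(normal_sample))
```

A deployed monitor would also track the Hotelling T² statistic inside the principal subspace and set control limits on both from the base-case distribution.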
Shirasaki, Masato; Yoshida, Naoki
2014-05-01
The measurement of cosmic shear using weak gravitational lensing is a challenging task that involves a number of complicated procedures. We study in detail the systematic errors in the measurement of weak-lensing Minkowski functionals (MFs). Specifically, we focus on systematics associated with galaxy shape measurements, photometric redshift errors, and shear calibration correction. We first generate mock weak-lensing catalogs that directly incorporate the actual observational characteristics of the Canada-France-Hawaii Lensing Survey (CFHTLenS). We then perform a Fisher analysis using the large set of mock catalogs for various cosmological models. We find that the statistical error associated with the observational effects degrades the cosmological parameter constraints by a factor of a few. The Subaru Hyper Suprime-Cam (HSC) survey, with a sky coverage of ~1400 deg², will constrain the dark energy equation-of-state parameter with an error of Δw₀ ~ 0.25 by the lensing MFs alone, but biases induced by the systematics can be comparable to the 1σ error. We conclude that the lensing MFs are powerful statistics beyond the two-point statistics only if well-calibrated measurement of both the redshifts and the shapes of source galaxies is performed. Finally, we analyze the CFHTLenS data to explore the ability of the MFs to break degeneracies between a few cosmological parameters. Using a combined analysis of the MFs and the shear correlation function, we derive the matter density Ω_m0 = 0.256^{+0.054}_{-0.046}.
Wong, Cheuk-Yin; Wilk, Grzegorz; Cirto, Leonardo J. L.; Tsallis, Constantino
2015-01-01
Transverse spectra of both jets and hadrons obtained in high-energy $pp$ and $p\bar p$ collisions at central rapidity exhibit power-law behavior of $1/p_T^n$ at high $p_T$. The power index $n$ is 4-5 for jet production and is slightly greater for hadron production. Furthermore, the hadron spectra, spanning over 14 orders of magnitude down to the lowest $p_T$ region in $pp$ collisions at the LHC, can be adequately described by a single nonextensive statistical mechanical distribution that is widely used in other branches of science. This suggests indirectly the dominance of the hard-scattering process over essentially the whole $p_T$ region at central rapidity in $pp$ collisions at the LHC. We show here direct evidence of such dominance of the hard-scattering process by investigating the power index of UA1 jet spectra over an extended $p_T$ region and the two-particle correlation data of the STAR and PHENIX Collaborations in high-energy $pp$ and $p\bar p$ collisions at central rapidity. We then study how the showering of the hard-scattering product partons alters the power index of the hadron spectra and leads to a hadron distribution that can be cast into a single-particle nonextensive statistical mechanical distribution. Because of this connection, the nonextensive statistical mechanical distribution can be considered a lowest-order approximation of the hard scattering of partons followed by the subsequent process of parton showering that turns the jets into hadrons in high-energy $pp$ and $p\bar p$ collisions.
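Extracting a power index from a spectrum reduces to a slope fit in log-log space. The sketch below assumes a synthetic $1/p_T^n$ spectrum with $n = 4.5$ and mild scatter; it illustrates only the fitting step, not the UA1 analysis itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical hard-scattering spectrum dN/dp_T ~ 1/p_T^n with n = 4.5,
# sampled on 25 p_T points with mild lognormal scatter.
p_t = np.geomspace(10.0, 200.0, 25)
n_true = 4.5
spectrum = p_t ** (-n_true) * rng.lognormal(0.0, 0.02, p_t.size)

# The power index is minus the slope of log(dN/dp_T) versus log(p_T).
slope, intercept = np.polyfit(np.log(p_t), np.log(spectrum), 1)
n_fit = -slope
print(abs(n_fit - n_true) < 0.1)
```

Fitting the same way over different $p_T$ windows is how a running or extended-range power index, as discussed above, would be probed.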