hp calculators HP 50g Confidence Intervals Real Estate
Vetter, Frederick J.
Contents: The STAT menu; Confidence Intervals; Practice finding confidence intervals; Real Estate. The STAT menu: The Statistics menu ...
Linear Regression Sample Proportions Interpretation of the Confidence Interval Interval Estimation
Watkins, Joseph C.
Topic 16: Interval Estimation, Additional Topics. Outline: Linear Regression; Sample Proportions; Interpretation of the Confidence Interval.
Honest Confidence Intervals for the Error Variance in Stepwise Regression
Stine, Robert A.
Honest Confidence Intervals for the Error Variance in Stepwise Regression, by Dean P. Foster and Robert A. Stine. ... alternatives are used. These simpler algorithms (e.g., forward or backward stepwise regression) obtain ...
Setting confidence intervals for bounded parameters a different perspective
Fraser, D A S; Wong, A C M
2003-01-01
The estimation of signal frequency count in the presence of background noise has had much recent discussion in the physics literature, and Mandelkern [1] brings the core issues to the statistical community, in turn leading to extensive discussion by statisticians. The primary focus in [1] and in the discussion rests on confidence interval procedures. We discuss various anomalies and misleading features in this use of confidence theory, and argue that the usage is essentially decision theoretic and is being applied in a context that invites an inferential approach. We then extract what we view as the inference elements, the fundamental information available from the model and the data. This is illustrated using some simple data and some recent data from the physics literature.
Approximate and Fiducial Confidence Intervals for the Difference Between Two Binomial Proportions
Krishnamoorthy, Kalimuthu
Approximate and Fiducial Confidence Intervals for the Difference Between Two Binomial Proportions K of estimating the difference between two binomial proportions is considered. Closed-form approximate confidence intervals (CIs), and a fiducial CI for the difference between proportions are proposed. The approximate CIs
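One family of closed-form approximate CIs for the difference p1 - p2 is the familiar Wald interval; a minimal sketch of that construction (the counts and the 1.96 quantile are illustrative, and this is not necessarily the variant the paper recommends):

```python
import math

def wald_ci_diff(x1, n1, x2, n2, z=1.96):
    """Approximate (Wald) CI for p1 - p2 from two independent
    binomial samples; z = 1.96 gives roughly 95% coverage."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d - z * se, d + z * se

# hypothetical counts: 45/100 successes vs 30/100 successes
lo, hi = wald_ci_diff(45, 100, 30, 100)
```

The Wald interval is known to undercover for small samples or extreme proportions, which is precisely why papers such as the one above propose alternatives.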
PCA-based bootstrap confidence interval tests for gene-disease association involving multiple SNPs
Peng, Qianqian; Zhao, Jinghua; Xue, Fuzhong
2010-01-26
A PCA-based bootstrap confidence interval test (PCA-BCIT), which directly uses the PC scores to assess gene-disease association, was developed and evaluated for three ways of extracting PCs, i.e., cases only (CAES), controls only (COES), and cases and controls combined...
Doebling, S.W.; Farrar, C.R. [Los Alamos National Lab., NM (United States); Cornwell, P.J. [Rose Hulman Inst. of Tech., Terre Haute, IN (United States)
1998-02-01
This paper presents a comparison of two techniques used to estimate the statistical confidence intervals on modal parameters identified from measured vibration data. The first technique is Monte Carlo simulation, which involves the repeated simulation of random data sets based on the statistics of the measured data and an assumed distribution of the variability in the measured data. A standard modal identification procedure is repeatedly applied to the randomly perturbed data sets to form a statistical distribution on the identified modal parameters. The second technique is the Bootstrap approach, where individual Frequency Response Function (FRF) measurements are randomly selected with replacement to form an ensemble average. This procedure, in effect, randomly weights the various FRF measurements. These weighted averages of the FRFs are then put through the modal identification procedure. The modal parameters identified from each randomly weighted data set are then used to define a statistical distribution for these parameters. The basic difference in the two techniques is that the Monte Carlo technique requires the assumption on the form of the distribution of the variability in the measured data, while the bootstrap technique does not. Also, the Monte Carlo technique can only estimate random errors, while the bootstrap statistics represent both random and bias (systematic) variability such as that arising from changing environmental conditions. However, the bootstrap technique requires that every frequency response function be saved for each average during the data acquisition process. Neither method can account for bias introduced during the estimation of the FRFs. This study has been motivated by a program to develop vibration-based damage identification procedures.
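The bootstrap resampling step described above can be sketched generically; the percentile method below resamples the measurements with replacement and needs no distributional assumption (the frequency values are hypothetical stand-ins for identified modal parameters):

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval: resample the
    measurements with replacement, recompute the statistic each
    time, and take empirical quantiles of the replicates."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# hypothetical identified natural frequencies (Hz) from repeated tests
freqs = [10.2, 10.5, 9.9, 10.1, 10.4, 10.0, 10.3, 10.2, 10.1, 10.6]

def mean(xs):
    return sum(xs) / len(xs)

lo, hi = bootstrap_ci(freqs, mean)
```

In the paper's setting, `stat` would be the full modal-identification procedure applied to randomly reweighted FRF averages rather than a simple mean.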
Confidence belts on bounded parameters
J. Bouchez
2000-01-13
We show that the unified method recently proposed by Feldman and Cousins to put confidence intervals on bounded parameters cannot avoid the possibility of getting null results. A modified Bayesian approach is also proposed (although not advocated) which ensures no null results and proper coverage.
Office of Environmental Management (EM)
Long-Term Waste Confidence Update Christine Pineda Office of Nuclear Material Safety and Safeguards U.S. Nuclear Regulatory Commission National Transportation Stakeholders Forum...
ON CONFIDENCE INTERVALS ASSOCIATED WITH THE USUAL AND ADJUSTED LIKELIHOODS
Reid, Nancy
Institute of Management, Post Box No 16757, Calcutta 700 027, India. N. Reid, Department of Statistics ... by likelihood ratio statistics arising from the usual profile likelihood and various adjustments thereof ... statistic; parameter orthogonality; point estimation. 1. Introduction: In recent years, there has been ...
Golden, M.
2013-01-01
Environmental Defense Fund's Investor Confidence Project: Delivering Investment Quality Energy Efficiency to Market. ESL-KT-13-12-38, CATEE 2013: Clean Air Through Energy Efficiency Conference, San Antonio, Texas, Dec. 16-18. Investor Confidence Project... Actionable Data ... Near-Term: Not Enough Deal-Flow; High Transaction Costs; Lack of Viable Origination Channels; Highly Variable Performance; Complex...
CBECS 1992 - Detailed Tables Word Definitions
U.S. Energy Information Administration (EIA) Indexed Site
Confidence Levels: The 95-percent confidence range can be determined using the approximate standard error of the estimate. To calculate the 95-percent confidence...
Random sets and confidence procedures
Barnett, William A.
1979-06-01
Let S: (X, A, (P_θ)_{θ∈Θ}) → (Y, T, (Q_θ)_{θ∈Θ}) be a random set with Y ⊆ P(Θ) \ {∅} and with Q_θ the probability distribution of S induced on Y by P_θ. Assume that S is surjective. The relation of statistical confidence sets to the following definition will be investigated... of confidence procedures now can be defined. DEFINITION 6. Let S be a confidence procedure. Then S has (lower) confidence level γ = inf{Q_θ(Ê_θ) : θ ∈ Θ}. If S is a confidence procedure, and if x ∈ X, then S(x) will be called a confidence subset of Θ...
Anderson, J.
1995-10-01
Results of a financial ranking survey of power projects show reasonably strong activity when compared to previous surveys. Perhaps the most notable trend is the continued increase in the number of international deals being reported. Nearly 62 percent of the transactions reported were for non-US projects. This increase will likely expand with time as developers and lenders gain confidence in certain regions. For the remainder of 1995 and into 1996 it is likely that financial activity will continue at a steady pace. A number of projects in various markets are poised to reach financial close relatively soon. Developers, investment bankers, and governments are all gaining experience and becoming more comfortable with the process.
W. B. Vasantha Kandasamy; Florentin Smarandache
2010-12-08
In this book we use only special types of intervals and introduce the notion of different types of interval linear algebras and interval vector spaces using intervals of the form [0, a], where the intervals are from Zn or Z+ \cup {0} or Q+ \cup {0} or R+ \cup {0}. A systematic development is made, starting from set interval vector spaces and proceeding to group interval vector spaces. Vector spaces are taken as interval polynomials, interval matrices, or just intervals over suitable sets, semigroups, or groups. The main feature of this book is that the authors have given over 350 examples. This book has six chapters. Chapter one is introductory in nature. Chapter two introduces the notion of set interval linear algebras of types one and two. Set fuzzy interval linear algebras and their properties are discussed in chapter three. Chapter four introduces several types of interval linear bialgebras and bivector spaces and studies them. Possible applications are given in chapter five. Chapter six suggests nearly 110 problems of all levels.
Confidence Measures for Evaluating Pronunciation Models
Williams, Gethin; Renals, Steve
In this paper, we investigate the use of confidence measures for the evaluation of pronunciation models and the employment of these evaluations in an automatic baseform learning process. The confidence measures and ...
A recipe for the construction of confidence limits
Iain A Bertram et al.
2000-04-12
In this note, the authors present the recipe recommended by the Search Limits Committee for the construction of confidence intervals for the use of D0 collaboration. In another note, currently in preparation, they present the rationale for this recipe, a critique of the current literature on this topic, and several examples of the use of the method. This note is intended to fill the need of the collaboration to have a reference available until the more complete note is finished. Section 2 introduces the notation used in this note, and Section 3 contains the suggested recipe.
Masci, Frank
... of the beta distribution using modern mathematical software packages (e.g., R, MATLAB, MATHEMATICA, IDL, PYTHON) ...
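The fragment above concerns computing binomial-proportion confidence limits from quantiles of the beta distribution; an equivalent stdlib-only sketch inverts the binomial tail probabilities directly by bisection (the Clopper-Pearson construction, with illustrative counts):

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) CI for a binomial proportion, found by
    bisection on the binomial tails; numerically equivalent to taking
    quantiles of the beta distribution."""
    def bisect(f, lo, hi, tol=1e-9):
        # f is True at lo and False at hi; find the boundary
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # lower limit: largest p with P(X >= k | p) <= alpha/2
    lower = 0.0 if k == 0 else bisect(
        lambda p: 1 - binom_cdf(k - 1, n, p) <= alpha / 2, 0.0, 1.0)
    # upper limit: smallest p with P(X <= k | p) <= alpha/2
    upper = 1.0 if k == n else 1.0 - bisect(
        lambda q: binom_cdf(k, n, 1 - q) <= alpha / 2, 0.0, 1.0)
    return lower, upper

lo, hi = clopper_pearson(2, 10)
```

In the packages the fragment names, the same limits come from `qbeta(alpha/2, k, n-k+1)` and `qbeta(1-alpha/2, k+1, n-k)` (R's parameterization).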
Confidence intervals for the encircled energy fraction and the half energy width
Vacanti, Giuseppe
2015-01-01
The Encircled Energy Fraction and its quantiles, notably the Half Energy Width, are routinely used to characterize the quality of X-ray optical systems. They are however always quoted without a statistical error. We show how non-parametric statistical methods can be used to redress this situation, and we discuss how the knowledge of the statistical error can be used to speed up the characterization efforts for future X-ray observatories.
Computing confidence intervals on solution costs for stochastic grid generation expansion problems.
Woodruff, David L.; Watson, Jean-Paul
2010-12-01
A range of core operations and planning problems for the national electrical grid are naturally formulated and solved as stochastic programming problems, which minimize expected costs subject to a range of uncertain outcomes relating to, for example, uncertain demands or generator output. A critical decision issue relating to such stochastic programs is: How many scenarios are required to ensure a specific error bound on the solution cost? Scenarios are the key mechanism used to sample from the uncertainty space, and the number of scenarios drives computational difficulty. We explore this question in the context of a long-term grid generation expansion problem, using a bounding procedure introduced by Mak, Morton, and Wood. We discuss experimental results using problem formulations independently minimizing expected cost and down-side risk. Our results indicate that we can use a surprisingly small number of scenarios to yield tight error bounds in the case of expected cost minimization, which has key practical implications. In contrast, error bounds in the case of risk minimization are significantly larger, suggesting more research is required in this area in order to achieve rigorous solutions for decision makers.
Surveillance test interval optimization
Cepin, M.; Mavko, B. [Institut Jozef Stefan, Ljubljana (Slovenia)]
1995-12-31
Technical specifications have been developed on the basis of deterministic analyses, engineering judgment, and expert opinion. This paper introduces our risk-based approach to surveillance test interval (STI) optimization. This approach consists of three main levels. The first level is the component level, which serves as a rough estimation of the optimal STI and can be calculated analytically by differentiating an equation for mean unavailability. The second and third levels give more representative results. They take into account the results of probabilistic risk assessment (PRA) calculated by a personal computer (PC) based code and are based on system unavailability at the system level and on core damage frequency at the plant level.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
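One of the easier cases discussed in the report can be made concrete: when each observation is an interval [l_i, u_i], the set of possible sample means (over all point values consistent with the intervals) is itself an interval. A minimal sketch (the data are hypothetical; note that bounds on the variance are much harder to compute in general, as the report discusses):

```python
# interval-valued measurements: each observation is (lower, upper)
data = [(1.0, 1.4), (2.1, 2.3), (0.8, 1.5), (1.9, 2.0)]

n = len(data)
# the sample mean is monotone in every observation, so its range over
# all consistent point values is exactly [mean of lowers, mean of uppers]
mean_lo = sum(l for l, _ in data) / n
mean_hi = sum(u for _, u in data) / n
```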
Partisanship and Voter Confidence, 2000-2012
Sances, Michael W.
2014-06-01
To what degree is voter confidence in election procedures driven by satisfaction with the outcome of an election, as opposed to trust in government or objective features of the polling place, such as voting technology? ...
Confidence Measures Derived from an Acceptor HMM
Williams, Gethin; Renals, Steve
In this paper we define a number of confidence measures derived from an acceptor HMM and evaluate their performance for the task of utterance verification using the North American Business News (NAB) and Broadcast News (BN) corpora. Results...
Florida consumer confidence holds steady in May
Belogay, Eugene A.
Consumer confidence held steady at 68 in May after dropping for three months since Feb. 1, when gasoline prices began shooting up, according to a new survey. But Floridians' perceptions of their own finances ... from a revised 66 in April on worries about jobs and inflation for groceries and gasoline. The survey ...
Confidence in the neutrino mass hierarchy
Evslin, Jarah
2013-01-01
The number of sigma of confidence in a determination of the neutrino mass hierarchy may be obtained from the statistic Delta chi squared. However, as the hierarchy is a discrete variable, this number is not given by the usual square root formula. We review a simple Bayesian formula for the confidence in the hierarchy determination that can be obtained from the median experiment as a function of Delta chi squared. We compare this analytical formula to 6 years of simulated data from JUNO together with a 4% (1%) determination of the effective atmospheric mass splitting from the disappearance channel at MINOS (NOvA). We find a Delta chi squared of 11 (20) yielding 2.6 sigma (3.9 sigma) of confidence. However when the unknown nonlinear energy response of the detector is included in our analysis this significance degrades considerably. This degradation can be eliminated by dividing the single detector into a near and far detector of the same total target mass. A further advantage of a second detector is that, even ...
High resolution time interval meter
Martin, A.D.
1986-05-09
Method and apparatus are provided for measuring the time interval between two events to a higher resolution than is reliably available from conventional circuits and components. An internal clock pulse is provided at a frequency compatible with conventional component operating frequencies for reliable operation. Lumped-constant delay circuits are provided for generating outputs at delay intervals corresponding to the desired high resolution. An initiation START pulse is input to generate first high resolution data. A termination STOP pulse is input to generate second high resolution data. Internal counters count at the low-frequency internal clock pulse rate between the START and STOP pulses. The first and second high resolution data are logically combined to directly provide high resolution data to one counter and correct the count in the low resolution counter to obtain a high resolution time interval measurement.
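The coarse-counter-plus-delay-line scheme described above amounts to standard coarse/fine interpolation; a hedged sketch of the combining arithmetic (the sign convention and all numbers are illustrative, not the patent's actual circuit values):

```python
def interval_ns(coarse_count, clock_period_ns, fine_start_ns, fine_stop_ns):
    """Coarse/fine interpolation: the coarse counter measures whole
    clock periods between START and STOP; the delay-line (fine) data
    give each pulse's offset relative to the clock edges and are
    combined to recover the interval to sub-clock resolution."""
    return coarse_count * clock_period_ns + fine_start_ns - fine_stop_ns

# e.g. an 8 MHz clock has a 125 ns period; fine offsets come from delay taps
t = interval_ns(coarse_count=4, clock_period_ns=125.0,
                fine_start_ns=30.0, fine_stop_ns=10.0)
```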
INTERVAL METHODS IN REMOTE SENSING
Ward, Karen
INTERVAL METHODS IN REMOTE SENSING: RELIABLE SUBDIVISION OF GEOLOGICAL AREAS. David D. Coblentz, G. ... of the locations which weren't that thoroughly analyzed. ... The subdivision of a geological zone ... TOPOGRAPHIC INFORMATION: One reason for subjectivity of the geological subdivision is the fact ...
Frequent-Interval Seismic CPTu
Office of Energy Efficiency and Renewable Energy (EERE)
Frequent-Interval Seismic CPTu D. Bruce Nothdurft, MSCE, PE, PG SRS Geotechnical Engineering Department Savannah River Nuclear Solutions Alec V. McGillivray, PhD, PE Geotechnical Consultant Brent J. Gutierrez, PhD, PE NPH Engineering Manager, DOE-SR
[Well-completion diagram residue: basalt rock units, cased interval, slotted casing, submersible pump, water level bls. One pump intake near 262 ft bls, depth to water 245.24 ft (October 2, 2014); another near 604 ft bls, depth to water 600.32 ft (October 9, 2014).]
Diagnosing Anomalous Network Performance with Confidence
Settlemyer, Bradley W; Hodson, Stephen W; Kuehn, Jeffery A; Poole, Stephen W
2011-04-01
Variability in network performance is a major obstacle in effectively analyzing the throughput of modern high performance computer systems. High performance interconnection networks offer excellent best-case network latencies; however, highly parallel applications running on parallel machines typically require consistently high levels of performance to adequately leverage the massive amounts of available computing power. Performance analysts have usually quantified network performance using traditional summary statistics that assume the observational data is sampled from a normal distribution. In our examinations of network performance, we have found this method of analysis often provides too little data to understand anomalous network performance. Our tool, Confidence, instead uses an empirically derived probability distribution to characterize network performance. In this paper we describe several instances where the Confidence toolkit allowed us to understand and diagnose network performance anomalies that we could not adequately explore with the simple summary statistics provided by traditional measurement tools. In particular, we examine a multi-modal performance scenario encountered with an Infiniband interconnection network and we explore the performance repeatability on the custom Cray SeaStar2 interconnection network after a set of software and driver updates.
Sample sizes for confidence limits for reliability.
Darby, John L.
2010-02-01
We recently performed an evaluation of the implications of a reduced stockpile of nuclear weapons for surveillance to support estimates of reliability. We found that one technique developed at Sandia National Laboratories (SNL) under-estimates the required sample size for systems-level testing. For a large population the discrepancy is not important, but for a small population it is important. We found that another technique used by SNL provides the correct required sample size. For systems-level testing of nuclear weapons, samples are selected without replacement, and the hypergeometric probability distribution applies. Both of the SNL techniques focus on samples without defects from sampling without replacement. We generalized the second SNL technique to cases with defects in the sample. We created a computer program in Mathematica to automate the calculation of confidence for reliability. We also evaluated sampling with replacement where the binomial probability distribution applies.
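The hypergeometric calculation described above, confidence from zero-defect sampling without replacement, can be sketched in a few lines (population and sample sizes are illustrative):

```python
import math

def confidence_defects_at_most(N, n, D):
    """Confidence with which a sample of n items drawn without
    replacement from a population of N, showing zero defects, rules
    out the hypothesis that the population contains D (or more)
    defects: 1 minus the hypergeometric probability of seeing no
    defects when exactly D are present."""
    p_zero = math.comb(N - D, n) / math.comb(N, n)
    return 1.0 - p_zero

# e.g. 20 tested from a stockpile of 100, none defective
conf = confidence_defects_at_most(N=100, n=20, D=10)
```

Because the zero-defect probability decreases as D grows, evaluating at D gives the conservative (worst-case) confidence for "fewer than D defects".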
... be heated in a pressure vessel or oil bath, with some temperature-measuring device (e.g., a thermocouple) ... there is no standard method to estimate kinetic parameters in low-moisture, conduction-heated foods ... and maintain pressure. As temperatures and pressures increase, measuring sample temperature may become ...
Confidence in the neutrino mass hierarchy
Jarah Evslin
2013-11-14
The number of sigma of confidence in a determination of the neutrino mass hierarchy may be obtained from the statistic Delta chi squared. However, as the hierarchy is a discrete variable, this number is not given by the usual square root formula. We review a simple Bayesian formula for the sensitivity to the hierarchy that can be obtained from the median experiment as a function of Delta chi squared. We compare this analytical formula to 6 years of simulated data from JUNO together with a 4% (1%) determination of the effective atmospheric mass splitting from the disappearance channel at MINOS (NOvA). We find a Delta chi squared of 11 (20) yielding 2.6 sigma (3.9 sigma). However when the unknown nonlinear energy response of the detector is included in our analysis this significance degrades considerably. This degradation can be eliminated by dividing the single detector into a near and far detector of the same total target mass. A further advantage of a second detector is that, even while the reactor neutrino experiment runs, the decay at rest of a single, high intensity, continuously running pion source close to one of the detectors, such as that described by the DAEdALUS project, may determine the leptonic CP-violating phase delta.
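The Bayesian formula referred to above can be sketched, assuming the median-experiment probability of the true hierarchy takes the logistic form P = 1/(1 + exp(-Δχ²/2)) and converting to one-sided Gaussian sigmas; this reproduces the 2.6σ and 3.9σ figures quoted in the abstract:

```python
import math
from statistics import NormalDist

def hierarchy_confidence_sigmas(delta_chi2):
    """Posterior probability of the true mass hierarchy from the
    median experiment, expressed as one-sided Gaussian sigmas."""
    p = 1.0 / (1.0 + math.exp(-delta_chi2 / 2.0))
    return NormalDist().inv_cdf(p)

s11 = hierarchy_confidence_sigmas(11)  # roughly 2.6 sigma
s20 = hierarchy_confidence_sigmas(20)  # roughly 3.9 sigma
```

The point of the paper is precisely that this is not the naive sqrt(Δχ²) rule, which would give 3.3σ and 4.5σ instead.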
High resolution time interval counter
Condreva, K.J.
1994-07-26
A high resolution counter circuit measures the time interval between the occurrence of an initial and a subsequent electrical pulse to two nanoseconds resolution using an eight megahertz clock. The circuit includes a main counter for receiving electrical pulses and generating a binary word--a measure of the number of eight megahertz clock pulses occurring between the signals. A pair of first and second pulse stretchers receive the signal and generate a pair of output signals whose widths are approximately sixty-four times the time between the receipt of the signals by the respective pulse stretchers and the receipt by the respective pulse stretchers of a second subsequent clock pulse. Output signals are thereafter supplied to a pair of start and stop counters operable to generate a pair of binary output words representative of the measure of the width of the pulses to a resolution of two nanoseconds. Errors associated with the pulse stretchers are corrected by providing calibration data to both stretcher circuits, and recording start and stop counter values. Stretched initial and subsequent signals are combined with autocalibration data and supplied to an arithmetic logic unit to determine the time interval in nanoseconds between the pair of electrical pulses being measured. 3 figs.
Algebraic Structures Using Super Interval Matrices
W. B. Vasantha Kandasamy; Florentin Smarandache
2011-10-01
In this book super interval matrices using the special type of intervals of the form [0, a] are introduced. Several algebraic structures like semigroups, groups, semirings, rings, semivector spaces and vector spaces are introduced. Special fuzzy linear algebras are introduced using the concept of super fuzzy interval matrices.
Algebraic Structures using Natural Class of Intervals
W. B. Vasantha Kandasamy; Florentin Smarandache
2011-07-02
This book has eleven chapters. Chapter one describes all types of natural class of intervals and the arithmetic operations on them. Chapter two introduces the semigroup of the natural class of intervals using R or Zn and studies the properties associated with them. Chapter three studies the notion of rings constructed using the natural class of intervals. Matrix theory using the special class of intervals is analyzed in chapter four of this book. Chapter five deals with polynomials using interval coefficients. New types of rings of natural intervals are introduced and studied in chapter six. The notion of vector space using the natural class of intervals is built in chapter seven. In chapter eight fuzzy natural classes of intervals are introduced and algebraic structures on them are built and described. Algebraic structures using the natural class of neutrosophic intervals are developed in chapter nine. Chapter ten suggests some possible applications. The final chapter proposes over 200 problems, of which some are at research level, some difficult, and others simple.
Watchdog: Confident Event Detection in Heterogeneous Sensor Networks
Zhou, Gang
... observations may easily yield a confident event detection decision with a small, energy-efficient cluster ... With vehicle detection trace data and a building traffic monitoring testbed of IRIS motes, we demonstrate ...
NOVELTY, CONFIDENCE & ERRORS IN CONNECTIONIST SYSTEMS, Stephen J. Roberts & William Penny
Roberts, Stephen
NOVELTY, CONFIDENCE & ERRORS IN CONNECTIONIST SYSTEMS. Stephen J. Roberts & William Penny, Neural ... Technology & Medicine, London, UK. s.j.roberts@ic.ac.uk, w.penny@ic.ac.uk. April 21, 1997. Abstract ... Key words ...
Neuropsychological mechanisms of interval timing behavior
Wilkinson, Gerald S.
... perceiving a beat in a musical composition, to returning to the stove just prior to the tea kettle whistling, to expecting a traffic light to change from red to green. Furthermore, interval timing is exhibited in a wide ...
Boolean lattices as intervals in clone lattices
Krokhin, Andrei
Boolean lattices as intervals in clone lattices. A. A. Krokhin, Ural State University, Dept. of Algebra and Discrete Mathematics, Lenin av. 51, 620083, Ekaterinburg, Russia. E-mail: Andrei.Krokhin@usu.ru. Abstract: All ...
Frequency domain design of interval controller
Park, Wunyong
1993-01-01
Subject: Electrical Engineering. FREQUENCY DOMAIN DESIGN OF INTERVAL CONTROLLER. A Thesis by WUNYONG PARK. Approved as to style and content by: S. P. Bhattacharyya (Chair of Committee), C. N. Georghiades (Member), A. Datta (Member), S. Jayasuriya (Member), L. H. Keel (Member), A. Patton (Head of Department). May 1993. ABSTRACT: Frequency Domain Design of Interval Controller. (May 1993) Wunyong Park, B.S., Yon Sei University; M.S., Yon Sei University. Chair of Advisory Committee: Dr. S...
The effect of terrorism on public confidence : an exploratory study.
Berry, M. S.; Baldwin, T. E.; Samsa, M. E.; Ramaprasad, A.; Decision and Information Sciences
2008-10-31
A primary goal of terrorism is to instill a sense of fear and vulnerability in a population and to erode confidence in government and law enforcement agencies to protect citizens against future attacks. In recognition of its importance, the Department of Homeland Security includes public confidence as one of the metrics it uses to assess the consequences of terrorist attacks. Hence, several factors--including a detailed understanding of the variations in public confidence among individuals, by type of terrorist event, and as a function of time--are critical to developing this metric. In this exploratory study, a questionnaire was designed, tested, and administered to small groups of individuals to measure public confidence in the ability of federal, state, and local governments and their public safety agencies to prevent acts of terrorism. Data were collected from the groups before and after they watched mock television news broadcasts portraying a smallpox attack, a series of suicide bomber attacks, a refinery bombing, and cyber intrusions on financial institutions that resulted in identity theft and financial losses. Our findings include the following: (a) the subjects can be classified into at least three distinct groups on the basis of their baseline outlook--optimistic, pessimistic, and unaffected; (b) the subjects make discriminations in their interpretations of an event on the basis of the nature of a terrorist attack, the time horizon, and its impact; (c) the recovery of confidence after a terrorist event has an incubation period and typically does not return to its initial level in the long-term; (d) the patterns of recovery of confidence differ between the optimists and the pessimists; and (e) individuals are able to associate a monetary value with a loss or gain in confidence, and the value associated with a loss is greater than the value associated with a gain. 
These findings illustrate the importance the public places in their confidence in government and law enforcement and also indicate that the level of importance is clearly of a magnitude on the order of other major terrorist event consequences, such as loss of human life and impacts to the economy.
Setting confidence belts Byron P. Roe and Michael B. Woodroofe
Woodroofe, Michael B.
Setting confidence belts. Byron P. Roe and Michael B. Woodroofe, Department of Physics ... credible belts for the mean of a Poisson distribution in the presence of a background ... Within the Bayesian framework, these belts are optimal. The credible limits are then examined from a frequentist point of view.
EU 'confident' of star power site By Jo Twist
EU 'confident' of star power site. By Jo Twist, BBC News Online science staff. Europe is still ... option because of its position on the war in Iraq. Star power: After the International Space Station, Iter ... stations, and would pave the way for commercial power production. In a fusion reaction, energy is produced ...
Confidence Estimation Methods for Partially Supervised Relation Extraction
Agichtein, Eugene
Confidence Estimation Methods for Partially Supervised Relation Extraction. Eugene Agichtein ... is a family of partially-supervised relation extraction systems that require little manual training. However ... method on a variety of relations. 1 Overview: Text documents convey valuable structured information ...
ORIGINAL PAPER Confidence levels for tsunami-inundation limits
Goldfinger, Chris
ORIGINAL PAPER: Confidence levels for tsunami-inundation limits in northern Oregon inferred from ... Accepted: 25 August 2009. © Springer Science+Business Media B.V. 2009. Abstract: To explore the local tsunami ... coseismic deformations for simulation of tsunami inundation at Cannon Beach, Oregon. Maximum ... A brief summary ...
CONFIDENCE LIMITS FOR POPULATION PROJECTIONS WHEN VITAL RATES VARY RANDOMLY
CONFIDENCE LIMITS FOR POPULATION PROJECTIONS WHEN VITAL RATES VARY RANDOMLY TIM GERRODE, age distribution, and vital rates are known (e.g., Leslie 1945; Keyfitz 1968). Such population rates are available. However, there is uncertainty in such projections. First, we rarely know vital
VOLUMETRIC MODELING THROUGH FUSION OF MULTIPLE RANGE IMAGES WITH CONFIDENCE
Abidi, Mongi A.
VOLUMETRIC MODELING THROUGH FUSION OF MULTIPLE RANGE IMAGES WITH CONFIDENCE ESTIMATE. A Thesis ... application using range images. A review of the volumetric modeling literature leads us to believe that we can ... be as efficient with range images as other volumetric approaches. The second half of this thesis describes ...
2011-08 "Restore User Confidence in the Risk Analysis, Communication...
Office of Environmental Management (EM)
2011-08 "Restore User Confidence in the Risk Analysis, Communication, Evaluation, and Reduction (RACER) Database"
Distributed Intersection Join of Complex Interval Sequences
Kriegel, Hans-Peter
Introduction: After two decades of temporal and spatial index research, the efficient management of ... can be aggregated to an interval sequence, such as periods of "high" stock prices for technical chart analysis (cf. ...). Simulations in virtual product environments [8] or engineering data management can be supported ...
DIMACS Technical Report 2003-37: Perfect interval filament graphs
DIMACS Technical Report 2003-37: Perfect interval filament graphs, by Fanica Gavril, DIMACS, Rutgers. ... are disjoint, their curves do not intersect; FI = { f_i | i ∈ I } is a family of interval filaments, and its intersection graph is an interval filament graph. The interval filament graphs contain the polygon ...
Interval Data Systems Inc | Open Energy Information
Inter-Korean military confidence building after 2003.
Tae-woo, Kim (Korea Institute for Defense Analyses, Seoul, Republic of Korea); Littlefield, Adriane C.; Vannoni, Michael Geoffrey; Sang-beom, Kim (Korea Institute for Defense Analyses, Seoul, Republic of Korea); Koelm, Jennifer Gay; Olsen, John Norman; Myong-jin, Kim (Korea Institute for Defense Analyses, Seoul, Republic of Korea); Sung-tack, Shin (Korea Institute for Defense Analyses, Seoul, Republic of Korea)
2003-08-01
Tensions on the Korean Peninsula remain high despite a long-term strategy by South Korea to increase inter-Korean exchanges in economics, culture, sports, and other topics. This is because the process of reconciliation has rarely extended to military and security topics and those initiatives that were negotiated have been ineffective. Bilateral interactions must include actions to reduce threats and improve confidence associated with conventional military forces (land, sea, and air) as well as nuclear, chemical, and biological activities that are applicable to developing and producing weapons of mass destruction (WMD). The purpose of this project is to develop concepts for inter-Korean confidence building measures (CBMs) for military and WMD topics that South Korea could propose to the North when conditions are right. This report describes the historical and policy context for developing security-related CBMs and presents an array of bilateral options for conventional military and WMD topics within a consistent framework. The conceptual CBMs address two scenarios: (1) improved relations where construction of a peace regime becomes a full agenda item in inter-Korean dialogue, and (2) continued tense inter-Korean relations. Some measures could be proposed in the short term under current conditions, others might be implemented in a series of steps, while some require a higher level of cooperation than currently exists. To support decision making by political leaders, this research focuses on strategies and policy options and does not include technical details.
Leveraging waveform complexity for confident detection of gravitational waves
Kanner, Jonah B; Cornish, Neil; Millhouse, Meg; Xhakaj, Enia; Salemi, Francesco; Drago, Marco; Vedovato, Gabriele; Klimenko, Sergey
2015-01-01
The recent completion of Advanced LIGO suggests that gravitational waves (GWs) may soon be directly observed. Past searches for gravitational-wave transients have been impacted by transient noise artifacts, known as glitches, introduced into LIGO data due to instrumental and environmental effects. In this work, we explore how waveform complexity, instead of signal-to-noise ratio, can be used to rank event candidates and distinguish short duration astrophysical signals from glitches. We test this framework using a new hierarchical pipeline that directly compares the Bayesian evidence of explicit signal and glitch models. The hierarchical pipeline is shown to have strong performance, and in particular, allows high-confidence detections of a range of waveforms at realistic signal-to-noise ratio with a two detector network.
Multiply Connected Topological Economics, Confidence Relation and Political Economy
Yi-Fang Chang
2010-02-14
Using formulas similar to those of the preference relation and the utility function, we propose confidence relations and corresponding influence functions that represent the various interacting strengths of different families, cliques and systems of organization. Since these can affect products, profit, prices, etc., in an economic system, and are usually independent of economic results, the system can produce a multiply connected topological economics. If the political economy is an economy accompanied by polity, it will consequently produce a binary economy. When the changes of the product and the influence are independent of one another, there may be a node or saddle point. When the influence function becomes large enough to reach a certain threshold value, it will form a wormhole with loss of capital. Various powers, and various corruptions, usually produce the economic wormhole.
Measurable Maximal Energy and Minimal Time Interval
Eiman Abou El Dahab; Abdel Nasser Tawfik
2014-01-14
The possibility of finding the measurable maximal energy and the minimal time interval is discussed in different quantum aspects. It is found that the linear generalized uncertainty principle (GUP) approach gives a non-physical result. Based on the large-scale Schwarzschild solution, the quadratic GUP approach is utilized. The calculations are performed at the shortest distance, at which general relativity is assumed to be a good approximation for quantum gravity, and at larger distances as well. It is found that both the maximal energy and the minimal time are of the order of the Planck scale, and the uncertainties in both quantities are accordingly bounded. Some physical insights are addressed. Also, the implications for the physics of the early Universe and for quantized mass are outlined. The results are related to the existence of a finite cosmological constant and a minimum mass (mass quanta).
Lin, Kevin; Eradat, Jilbert B.S.; Mehta, Niraj H.; Bent, Chris; Lee, Steve P.; Apple, Sophia K.; Bassett, Lawrence W.
2008-11-15
Purpose: To examine, in a retrospective study, whether the initial posttreatment mammogram offers any benefit to patients. Methods and Materials: Patients were selected who had radiation after breast-conservation therapy from 1995 through 2005 and had follow-up mammography at University of California-Los Angeles (UCLA) within 1 year of completing radiotherapy. Results of the initial follow-up mammogram were analyzed to determine the yield of this initial mammogram. Results: Between 1995 and 2005, 408 patients treated with breast-conserving therapy and radiation had follow-up mammograms at UCLA within 1 year of completion of radiation. Median age at radiation completion was 56.9 years. Median interval between radiation and the initial mammogram was 3.1 months. Ten patients were found to have suspicious findings on the initial postradiation mammogram, prompting biopsy, but only 2 were found to have recurrent cancer. None of those lesions were palpable. In both cases the recurrences were ductal carcinoma in situ. Thus, the yield of the initial postoperative mammogram as compared with physical examination findings is estimated at 0.49 recurrences detected per 100 mammograms performed (95% confidence interval 0.059-1.759). Conclusions: The yield of the initial postradiation mammography at UCLA seems to be low, and only noninvasive carcinomas were found. Our data support the rationale to avoid the initial short-interval postradiation mammography and evaluate patients at 12 months.
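The reported interval (0.059 to 1.759 recurrences per 100 mammograms for 2 recurrences in 408 examinations) matches an exact Clopper-Pearson binomial interval. As an illustrative sketch only, not the authors' code, it can be reproduced with a stdlib-only bisection; `binom_cdf` and `clopper_pearson` are our own helper names:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05, tol=1e-10):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion,
    found by bisecting the monotone binomial tail probabilities in p."""
    def solve(cond):
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if cond(mid):
                lo = mid  # condition still holds: move the bracket up
            else:
                hi = mid
        return (lo + hi) / 2.0
    # lower bound: p solving P(X <= k-1 | p) = 1 - alpha/2
    lower = 0.0 if k == 0 else solve(lambda p: binom_cdf(k - 1, n, p) >= 1 - alpha / 2)
    # upper bound: p solving P(X <= k | p) = alpha/2
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) >= alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(2, 408)
print(f"{100 * lo:.3f} to {100 * hi:.3f} recurrences per 100 mammograms")
```

Running this recovers the paper's stated 95% confidence interval to three decimals.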
Preliminaries Conserved Interval Distance between Non-trivial
Blin, Guillaume
Outline: Preliminaries, Results, Conclusion. Conserved Interval Distance between Non-trivial Genomes, Guillaume Blin and Romeo Rizzi, August the 16th.
Volatility return intervals analysis of the Japanese market
Jung, Woo-Sung; Havlin, Shlomo; Kaizoji, Taisei; Moon, Hie-Tae; Stanley, H Eugene
2007-01-01
We investigate scaling and memory effects in return intervals between price volatilities above a certain threshold $q$ for the Japanese stock market using daily and intraday data sets. We find that the distribution of return intervals can be approximated by a scaling function that depends only on the ratio between the return interval $\\tau$ and its mean $\\langle\\tau\\rangle$. We also find memory effects such that a large (or small) return interval follows a large (or small) interval by investigating the conditional distribution and mean return interval. The results are similar to previous studies of other markets and indicate that similar statistical features appear in different financial markets. We also compare our results between the period before and after the big crash at the end of 1989. We find that scaling and memory effects of the return intervals show similar features although the statistical properties of the returns are different.
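For readers unfamiliar with the quantity being studied, here is a minimal sketch (our illustration, not the authors' code) of extracting return intervals above a threshold q from a volatility series and scaling them by their mean, the variable whose distribution is reported to collapse onto a single function:

```python
import random

def return_intervals(series, q):
    """Intervals (in time steps) between successive values exceeding threshold q."""
    times = [t for t, v in enumerate(series) if v > q]
    return [b - a for a, b in zip(times, times[1:])]

# toy example on synthetic "volatility" data (absolute Gaussian noise)
random.seed(0)
vol = [abs(random.gauss(0, 1)) for _ in range(10_000)]
tau = return_intervals(vol, q=2.0)
mean_tau = sum(tau) / len(tau)
# scaled intervals tau / <tau>: the ratio the scaling function depends on
scaled = [t / mean_tau for t in tau]
print(len(tau), round(mean_tau, 1))
```

On real data one would feed in measured volatilities instead of synthetic noise; the analysis of memory effects then conditions on whether the preceding interval was large or small.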
Increasing Confidence In Geothermal Heat Pump Design Methods
Shonder, John A; Hughes, Patrick
1998-03-01
Sizing the ground heat exchanger is one of the most important tasks in the design of a geothermal heat pump (GHP) system. Undersizing the heat exchanger can result in poor operating efficiency, reduced comfort, and nuisance heat pump lockouts on safety controls, while an oversized heat exchanger increases the installation cost of the system. The cost of ground loop installation may mean the difference between a feasible and an unfeasible project. Thus there are strong incentives to select heat exchanger lengths which allow satisfactory performance under all operating conditions within a feasible project budget. Sizing a ground heat exchanger is not a simple calculation. In the first place, there is usually some uncertainty in the peak block and annual space conditioning loads for the building to be served by the GHPs. The thermal properties of the soil formation may be unknown as well. Drilling logs and core samples can identify the soil type, but handbook values for the thermal properties of soils vary widely. Properly-done short-term on-site tests and data analysis to obtain thermal properties provide more accurate information, but since these tests are expensive they are usually only feasible in large projects. Given the uncertainties inherent in the process, if designers were truly working 'close to the edge' - selecting the absolute minimum heat exchanger length required to meet the predicted loads - one would expect to see more examples of undersized heat exchangers. Indeed there have been a few. However, over the past twenty years GHPs have been installed and successfully operated at thousands of locations all over the world. Conversations with customers and facility managers reveal a high degree of satisfaction with the technology, but studies of projects reveal far more cases of generously sized ground heat exchangers than undersized ones. 
This indicates that the uncertainties in space conditioning loads and soil properties are covered by a factor of safety. These conservative designs increase the installed cost of GHP systems, limiting their use and applicability. Moreover, as ground heat exchanger sizing methods have improved, they have suggested (and field tests are beginning to verify) that standard bore backfill practices lead to unnecessarily large ground heat exchangers. Growing evidence suggests that in many applications use of sand backfill with a grout plug at the surface, or use of bottom-to-top thermally enhanced grout, may provide groundwater protection equal to current practice at far less cost. Site tests of thermal properties provide more accurate information, but since these tests are expensive they are usually only performed in large projects. Even so, because soil properties can vary over a distance as small as a few feet, the value of these tests is limited. One objective of ongoing research at the Oak Ridge National Laboratory (ORNL) is to increase designers' confidence in available ground heat exchanger sizing methods that lead to reliable yet cost-effective designs. To this end we have developed research-grade models that address the interactions between buildings, geothermal heat pump systems, and ground heat exchangers. The first application of these models was at Fort Polk, Louisiana, where the space conditioning systems of over 4,000 homes were replaced with geothermal heat pumps (Shonder and Hughes, 1997; Hughes et al., 1997). At Fort Polk, the models were calibrated to detailed data from one of the residences. Data on the energy use of the heat pump, combined with inlet and outlet water temperature and flow rate in the ground heat exchangers, allowed us to determine the thermal properties of the soil formation being experienced by the operating GHP system.
Outputs from the models provide all the data required by the various commercially-available ground loop sizing programs. Accurate knowledge of both the building loads and the soil properties eliminated the uncertainty normally associated with the design process, and allowed us to compare the predictions of the commercially-available
Scaling and memory in volatility return intervals in financial markets
Stanley, H. Eugene
Scaling and memory in volatility return intervals in financial markets, Kazuko Yamasaki, Lev ... We study the return intervals between the daily volatilities of the price changes that are above ... federalreserve.gov releases H10 hist. We choose to study daily data records because there are intraday trends ...
Gaining Industrial Confidence for the Introduction of Domain-Specific Languages
Hooman, Jozef
Gaining Industrial Confidence for the Introduction of Domain-Specific Languages, Arjan J. Mooij. ... of using DSLs, in industry there is also some reluctance against their introduction in product development. We address a number of issues that are important to gain industrial confidence.
On performance bounds for interval Time Petri Nets Simona Bernardi
Bernardi, Simona
Ingeniería de Sistemas, Universidad de Zaragoza, Spain (jcampos@unizar.es). Abstract: Interval time Petri Nets ... and Technology and the project PERF of the Italian Ministry of Education, University and Research. ...
Interval-valued Soft Constraint Problems, M. S. Pini
Rossi, Francesca
... preference intervals can be useful or necessary are energy trading and network traffic analysis [15], where the information is usually incomplete or erroneous. In energy trading, costs may be imprecise since ...
Trace formulas for fourth order operators on unit interval, II
Andrey Badanin; Evgeny Korotyaev
2014-12-16
We consider self-adjoint fourth order operators on the unit interval with Dirichlet-type boundary conditions. For such operators we determine a few trace formulas, similar to the Gelfand--Levitan formulas for second order operators.
On time-interval transformations in special relativity
A. V. Gopala Rao; K. S. Mallesh; K. N. Srinivasa Rao
2015-06-24
We revisit the problem of the Lorentz transformation of time-intervals in special relativity. We base our discussion on the time-interval transformation formula $ c\\Delta t' = \\gamma (c\\Delta t - \\vec{\\beta} \\cdot \\Delta \\vec{r}) $ in which $ \\Delta t'$ and $ \\Delta t $ are the time-intervals between a given pair of events in two inertial frames $ S $ and $ S'$ connected by a general boost. We observe that the Einstein time-dilation formula, the Doppler formula, and the relativity of simultaneity all follow when one of the frames in the time-interval transformation formula is chosen as the canonical frame of the underlying event-pair. We also discuss the interesting special case $ \\Delta t' = \\gamma \\Delta t $ of the time-interval transformation formula obtained by setting $ \\vec{\\beta} \\cdot \\Delta \\vec{r}=0 $ in it, and argue why it is really \\textbf{not} the Einstein time-dilation formula. Finally, we present some examples which involve material particles instead of light rays, and highlight the utility of the time-interval transformation formula as a calculational tool in the classroom.
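The quoted transformation formula is easy to exercise numerically. Here is a small sketch (our own function name, working in units where c = 1) that evaluates Delta t' = gamma * (Delta t - beta . Delta r) and checks the special cases the abstract discusses:

```python
import math

def transformed_interval(dt, dr, beta):
    """Delta t' = gamma * (Delta t - beta . Delta r), in units with c = 1.
    dt: time interval between the two events in frame S;
    dr: their spatial separation in S (3-vector);
    beta: velocity of S' relative to S (3-vector, |beta| < 1)."""
    b2 = sum(b * b for b in beta)
    gamma = 1.0 / math.sqrt(1.0 - b2)
    return gamma * (dt - sum(b * x for b, x in zip(beta, dr)))

beta = (0.6, 0.0, 0.0)          # gamma = 1.25
# beta . dr = 0 (separation perpendicular to the boost): dt' = gamma * dt,
# which the paper stresses is NOT the Einstein time-dilation formula in general
print(transformed_interval(1.0, (0.0, 2.0, 0.0), beta))  # ~1.25
# a clock at rest in S (dr = 0): genuine time dilation, dt' = gamma * dt
print(transformed_interval(1.0, (0.0, 0.0, 0.0), beta))  # ~1.25
# simultaneous events in S (dt = 0) separated along the boost: dt' != 0
print(transformed_interval(0.0, (1.0, 0.0, 0.0), beta))  # ~-0.75
```

The third call illustrates the relativity of simultaneity: events simultaneous in S are not simultaneous in S' when their separation has a component along the boost.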
Cremers, Daniel
... München, Germany. {dennis.mund,rudolph.triebel,daniel.cremers}@in.tum.de. Fig. 1: Object classification with active online Confidence Boosting; the image on the left ...
Robust Two-Step Confidence Sets, and the Trouble with the First Stage F-Statistic
Andrews, Isaiah
2014-09-04
When weak identification is a concern researchers frequently calculate confidence sets in two steps, first assessing the strength of identification and then, on the basis of this initial assessment, deciding whether to use ...
Doty, Sharon Lafferty
Instill Customer Confidence · Control Costs · Manage Business Growth Portfolios · Monitor Customer Satisfaction · Manage Cost of Capital · Advise and Consult · Develop Staff ... to deliver outstanding service anywhere, anytime. Values: Collaboration · Diversity · Excellence · Innovation
ESTIMATING PROPORTIONS WITH CONFIDENCE 19.1 a. 0.17.
Utts, Jessica
CHAPTER 19 ESTIMATING PROPORTIONS WITH CONFIDENCE. 19.1 a. 0.17. b. 0.019. c. A 95% confidence interval for the proportion who will experience headaches while taking Seldane-D is between 13.2% and 20.8%. 19.2 a. The sample proportion ... that actually caused the side effect. 19.3 a. It means that the proportion answering "Yes, should" in the sample ...
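The interval in part (c) follows from the sample proportion in (a) and the standard error in (b) via the textbook's two-standard-error rule (2 rather than 1.96). A one-line check, as our own sketch:

```python
def proportion_ci_95(p_hat, se):
    """Approximate 95% CI for a proportion: sample estimate plus/minus
    2 standard errors (the multiplier-of-2 rule used in the chapter)."""
    return p_hat - 2 * se, p_hat + 2 * se

lo, hi = proportion_ci_95(0.17, 0.019)
print(f"{lo:.1%} to {hi:.1%}")  # 13.2% to 20.8%
```

Using 1.96 instead of 2 would shift the endpoints slightly (13.3% to 20.7%), which is why the chapter's answers quote 13.2% and 20.8%.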
Interval estimation in calibration problems: an alternate approach
Quaino, Oscar Rodolfo
1983-01-01
... measurement of the dependent variable. This interval will depend upon the dependent variable and also on the outcome of the calibration experiment. For each unknown a statement is made in the sense that it belongs to the interval. Then he searches ... is computed as s^2 = sum_{i=1..n} (y_i - y_hat_i)^2 / (n - 2) and will also be denoted by NSE. (2.2) In the calibration problem, the classical estimator of x* given an observation y* is x_hat* = (y* - b_0) / b_1. Under the normality assumption x_hat* is the MLE of x* (Graybill 1976) ...
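In the calibration setting above, one fits y = b0 + b1*x to the calibration data and then estimates an unknown x* from a new observation y* by inverting the fitted line, x_hat* = (y* - b0)/b1. A stdlib-only sketch with our own helper names (illustrative, not the thesis's code):

```python
def calibrate(xs, ys):
    """Least-squares fit y = b0 + b1*x, plus residual variance s^2 = SSE/(n-2)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    return b0, b1, sse / (n - 2)

def classical_estimate(y_star, b0, b1):
    """Classical (inverse-regression) estimator x_hat* = (y* - b0) / b1."""
    return (y_star - b0) / b1

# toy calibration experiment, then estimate x* for a new reading y* = 5.0
b0, b1, s2 = calibrate([1, 2, 3, 4], [2.1, 3.9, 6.1, 7.9])
x_hat = classical_estimate(5.0, b0, b1)
print(round(x_hat, 3))
```

The interval procedures the thesis compares then widen this point estimate using s^2 and the design of the calibration experiment.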
Flierl, Markus
DENOISING OF VOLUMETRIC DEPTH CONFIDENCE FOR VIEW RENDERING, Srinivas Parthasarathy, Akul Chopra ... of Technology, Stockholm, Sweden. ABSTRACT: In this paper, we define volumetric depth confidence and propose ... Using the superposition principle, we define a volumetric depth confidence description of the underlying geometry.
Interval eigenproblem in tropical and fuzzy algebra Tolerance eigenproblem in tropical algebra
Mitchener, Paul
Interval eigenproblem in tropical and fuzzy algebra: tolerance eigenproblem in tropical algebra; tolerance eigenproblem in fuzzy algebra; tolerance interval eigenvectors in tropical and fuzzy algebra. Martin ... Workshop, Birmingham, May 16, 2013.
Towards Reliable SubDivision of Geological Areas: Interval Approach
Kreinovich, Vladik
Towards Reliable SubDivision of Geological Areas: Interval Approach, David D. Coblentz and Vladik Kreinovich. ... difficult to produce a reliable subdivision. The subdivision of a geological zone into segments is often ..., and often we do not have a statistically sufficient amount of thoroughly analyzed geological samples.
Interval methods for computing various refinements of Nash equilibria
Sainudiin, Raazesh
Interval methods for computing various refinements of Nash equilibria, Bartlomiej Jacek Kubica. ... assumptions on their knowledge, ... Concepts: dominant strategy equilibrium; the Nash equilibrium; the core of a game (for cooperative games); ... Nash equilibrium: let the game (X1, ..., Xn; q1, ..., qn) ...
Beam casting implicit surfaces on the GPU with interval arithmetic
de Figueiredo, Luiz Henrique
Beam casting implicit surfaces on the GPU with interval arithmetic, Francisco Ganacim, Luiz Henrique de Figueiredo, ... Brazil. Abstract: We present a GPU-based beam-casting method for rendering implicit surfaces in real time. Introduction: Rendering surfaces with ray casting is perhaps the clearest example of a potentially ...
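The core culling test behind beam casting with interval arithmetic can be illustrated with a toy interval class (our sketch, unrelated to the paper's GPU implementation): evaluating the implicit function over a box of intervals yields an enclosure of its range, and if that enclosure excludes zero the box cannot contain the surface, so the beam can skip it.

```python
class Interval:
    """Minimal interval arithmetic: enough to bound a polynomial over a box.
    Naive multiplication overestimates squares of intervals straddling zero
    ([-1,1]*[-1,1] gives [-1,1] rather than [0,1]), but remains a valid enclosure."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        o = _wrap(o)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__
    def __sub__(self, o):
        o = _wrap(o)
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        o = _wrap(o)
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    __rmul__ = __mul__
    def contains_zero(self):
        return self.lo <= 0.0 <= self.hi

def _wrap(v):
    return v if isinstance(v, Interval) else Interval(v, v)

def sphere(x, y, z):
    # implicit unit sphere f = x^2 + y^2 + z^2 - 1; works on numbers or Intervals
    return x * x + y * y + z * z - 1

# a box far from the surface: f's enclosure excludes zero, so the box is culled
far = sphere(Interval(2, 3), Interval(2, 3), Interval(2, 3))
# a box crossing the surface: the enclosure contains zero, so it is subdivided
near = sphere(Interval(0.5, 1.5), Interval(-0.1, 0.1), Interval(-0.1, 0.1))
print(far.contains_zero(), near.contains_zero())
```

A beam caster applies this test recursively, subdividing only boxes whose enclosure contains zero until the boxes are pixel-sized.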
Proton aurora related to intervals of pulsations of diminishing periods
California at Berkeley, University of
Proton aurora related to intervals of pulsations of diminishing periods A. G. Yahnin,1 T. A are generated because of a cyclotron instability of the anisotropic distribution of ring current ions. Proton precipitation produced by the cyclotron instability can be responsible for proton aurora. Indeed
Inferring Positional Homologs with Common Intervals of Sequences
Chauve, Cedric
... genes is an important problem in whole genome comparisons, for both functional and evolutionary studies. ... between genomes, also called positional homologs, based on the conservation of the genomic context. We consider genomes represented by their gene order, i.e. sequences of signed integers, and common intervals ...
CIGAL: Common Intervals Global ALigner Guillaume Blin1
Chauve, Cedric
... alignment [5]. We recycle this idea to align gene orders. Our data consist of two genomes represented by two sequences of signed identifiers. Those identifiers can be genes, gene families, or any other kind of genomic ... two genomes [3]. The problem of finding a maximal cover with a minimal number of common intervals ...
Comparing Bacterial Genomes by Searching their Common Intervals
Fertin, Guillaume
Comparing Bacterial Genomes by Searching their Common Intervals, Sébastien Angibaud, Damien ... Comparing bacterial genomes implies the use of a dedicated measure. It relies on comparing ... genomes ... that takes into account duplications. Its application on a concrete case, comparing E. coli and V. ...
A multi-interval MBSC theory for active correlations technique
Tsyganov, Yu S
2015-01-01
The purpose of this paper is to develop a formalism for the treatment of rare events, especially when one applies the active correlation method to suppress background products in heavy-ion-induced complete fusion nuclear reactions. This formalism is in fact an extension of the classical background signal combination formalism to the case of multiple time intervals.
Reverse Auction Bidding-Bid Time Intervals Analysis
Xiao, Mengyan
2015-05-11
... when the computer is bidding during segment 1 of each section, it shall randomly select from 2 seconds to 23 seconds as its time interval. Rule 2: when the computer is bidding during segment 2 of each section, it shall randomly select from 2 seconds to 9...
Karnowski, Thomas P. (Knoxville, TN); Tobin, Jr., Kenneth W. (Harriman, TN); Muthusamy Govindasamy, Vijaya Priya (Knoxville, TN); Chaum, Edward (Memphis, TN)
2012-07-10
A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
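The decision rule described can be sketched as follows. Note the use of Euclidean distance between the two detections as the accuracy parameter is our assumption for illustration; the patent's actual parameter definition is not specified here:

```python
import math

def confidence_level(coords_a, coords_b, cutoff):
    """Assign 'high' confidence when two independent optic-disc detections
    agree to within the primary risk cut-off, 'low' otherwise. The Euclidean
    distance between the two coordinate estimates stands in for the accuracy
    parameter (an assumption of this sketch)."""
    accuracy = math.dist(coords_a, coords_b)
    return "high" if accuracy < cutoff else "low"

# two detectors roughly agree: low risk of misdiagnosis, high confidence
print(confidence_level((120, 88), (123, 90), cutoff=10.0))   # high
# detectors disagree badly: flag for low confidence / manual review
print(confidence_level((120, 88), (240, 150), cutoff=10.0))  # low
```

The cut-off itself would be chosen, as the abstract says, to represent an acceptable risk of misdiagnosis for the automated technique.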
CONFIDENCE MEASURES FOR HYBRID HMM/ANN SPEECH RECOGNITION Gethin Williams and Steve Renals
Edinburgh, University of
... it is associated falls within some critical region, and is accepted otherwise. In the case of a one-tailed test, the acceptance and critical regions are delineated by a single threshold value of the test statistic. Two types ... for H1 to be accepted. In order to carry out such a test, a test statistic is required. A confidence ...
Retinal Vessel Extraction Using Multiscale Matched Filters, Confidence and Edge Measures
Bystroff, Chris
Retinal Vessel Extraction Using Multiscale Matched Filters, Confidence and Edge Measures, Michal ... of improving detection of low-contrast and narrow vessels and eliminating false detections at non-vascular structures, a new technique is presented for extracting vessels in retinal images. The core of the technique ...
Quest-V: A Virtualized Multikernel for High-Confidence Systems
Quest-V: A Virtualized Multikernel for High-Confidence Systems. Ye Li, Boston University (liye@cs.bu.edu); Matthew Danish, Boston University (md@cs.bu.edu); Richard West, Boston University (richwest@cs.bu.edu). Abstract: ... operating together as a distributed system on a chip. Quest-V uses virtualization techniques to isolate ...
Using Classification to Evaluate the Output of Confidence-Based Association Rule Mining
Frank, Eibe
Using Classification to Evaluate the Output of Confidence-Based Association Rule Mining, Stefan ... Hamilton, New Zealand ({mhall, eibe}@cs.waikato.ac.nz). Abstract: Association rule mining is a data mining ... concerning both running time and size of rule sets. 1 Introduction: Association rule mining is a widely ...
Best-arm Identification Algorithms for Multi-Armed Bandits in the Fixed Confidence Setting
Nowak, Robert
Best-arm Identification Algorithms for Multi-Armed Bandits in the Fixed Confidence Setting, Kevin ... concerned with identifying the arm with the highest mean in a multi-armed bandit problem using as few independent samples from the arms as possible. While the so-called "best arm problem" dates back to the 1950s, only ...
Sunlight: Fine-grained Targeting Detection at Scale with Statistical Confidence
(...riley,yannis,augustin,roxana,djhsu@cs.columbia.edu) ABSTRACT: We present Sunlight, a system that detects the causes of targeting phenomena on the web ... with statistical confidence. Unfortunately, today's Web is a very dark and complex ecosystem driven to a large extent by the massive ... Today's web is growing increasingly complex and impenetrable as a myriad of services collect ...
Tables for Trials and Failures with PD for Designated Confidence Level
Leach, Janice
2014-02-01
Two attachments are provided for performance testing of sensors and other Physical Protection System (PPS) components. The first attachment is a table of trials and failures, giving Probability of Detection (PD) for a designated confidence level, sorted by trials. The second attachment contains the same data, sorted by failures.
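A row of such a table can be reproduced with the standard exact one-sided binomial bound (our sketch; the report's exact method is not specified here): given n trials and f failures, the lower confidence bound on PD at confidence C is the value p at which observing at least n - f detections has probability 1 - C.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def pd_lower_bound(trials, failures, confidence, tol=1e-10):
    """Exact one-sided lower confidence bound on probability of detection:
    the p solving P(at least trials - failures detections | p) = 1 - confidence,
    found by bisection (the tail probability is monotone increasing in p)."""
    detections = trials - failures
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        tail = 1.0 - binom_cdf(detections - 1, trials, mid)  # P(X >= detections)
        if tail > 1.0 - confidence:
            hi = mid  # p too high: observed outcome would be too likely
        else:
            lo = mid
    return (lo + hi) / 2.0

# classic table entry: 30 trials with 0 failures supports PD >= 0.926 at 90% confidence
print(round(pd_lower_bound(30, 0, 0.90), 3))
```

Iterating over (trials, failures) pairs and a designated confidence level generates exactly the kind of table the two attachments describe.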
Estimation of confidence levels for physiology variables measured by a vital signs detection system
Estimation of confidence levels for physiology variables measured by a vital signs detection system. Quantifying the accuracy of physiological data measured by a Vital Signs Detection System (VSDS) plays a key ... The VSDS tested by the U.S. Army Research Institute of Environmental Medicine ...
Characterizing minimal interval completions: Towards better understanding of profile and pathwidth
Todinca, Ioan
Characterizing minimal interval completions: towards better understanding of profile and pathwidth. Pinar Heggernes, Karol Suchan, Ioan Todinca, Yngve Villanger. Abstract: Minimal interval completions ... An interval completion of a given graph is an interval supergraph of it on the same vertex set, obtained ...
Jagadeesan, Vikrant S.; Raleigh, David R.; Koshy, Matthew; Howard, Andrew R.; Chmura, Steven J.; Golden, Daniel W.
2014-01-01
Purpose: Students applying to radiation oncology residency programs complete 1 or more radiation oncology clerkships. This study assesses student experiences and perspectives during radiation oncology clerkships. The impact of didactic components and number of clerkship experiences in relation to confidence in clinical competency and preparation to function as a first-year radiation oncology resident are evaluated. Methods and Materials: An anonymous, Internet-based survey was sent via direct e-mail to all applicants to a single radiation oncology residency program during the 2012-2013 academic year. The survey was composed of 3 main sections including questions regarding baseline demographic information and prior radiation oncology experience, rotation experiences, and ideal clerkship curriculum content. Results: The survey response rate was 37% (70 of 188). Respondents reported 191 unique clerkship experiences. Of the respondents, 27% (19 of 70) completed at least 1 clerkship with a didactic component geared towards their level of training. Completing a clerkship with a didactic component was significantly associated with a respondent's confidence to function as a first-year radiation oncology resident (Wilcoxon rank-sum P=.03). However, the total number of clerkships completed did not correlate with confidence to pursue radiation oncology as a specialty (Spearman ρ, P=.48) or confidence to function as a first-year resident (Spearman ρ, P=.43). Conclusions: Based on responses to this survey, rotating students perceive that the majority of radiation oncology clerkships do not have formal didactic curricula. Survey respondents who completed a clerkship with a didactic curriculum reported feeling more prepared to function as a radiation oncology resident. However, completing an increasing number of clerkships does not appear to improve confidence in the decision to pursue radiation oncology as a career or to function as a radiation oncology resident.
These results support further development of structured didactic curricula for the radiation oncology clerkship.
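The two tests named in the abstract can be sketched with SciPy. The ratings below are hypothetical stand-ins for the survey's confidence data, not the study's actual responses; group sizes loosely follow the abstract's counts.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical 5-point confidence ratings, split by whether the
# respondent completed a clerkship with a didactic component.
with_didactic = rng.integers(3, 6, size=19).astype(float)
without_didactic = rng.integers(1, 5, size=51).astype(float)

# Wilcoxon rank-sum test: compare confidence between the two groups.
stat, p_ranksum = stats.ranksums(with_didactic, without_didactic)

# Spearman rank correlation: number of clerkships vs. confidence rating.
n_clerkships = rng.integers(1, 5, size=70)
confidence = rng.integers(1, 6, size=70)
rho, p_rho = stats.spearmanr(n_clerkships, confidence)
```

Both tests are rank-based, so they need no normality assumption about the underlying 5-point ratings.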
Holographic Calculation for Large Interval Rényi Entropy at High Temperature
Bin Chen; Jie-qiang Wu
2015-06-10
In this paper, we study the holographic Rényi entropy of a large interval on a circle at high temperature for the two-dimensional CFT dual to pure AdS$_3$ gravity. In the field theory, the Rényi entropy is encoded in the CFT partition function on $n$-sheeted torus connected with each other by a large branch cut. As proposed in 1412.0763, the effective way to read the entropy in the large interval limit is to insert a complete set of state bases of the twist sector at the branch cut. Then the calculation transforms into an expansion of four-point functions in the twist sector with respect to $e^{-\frac{2\pi TR}{n}}$. By using the operator product expansion of the twist operators at the branch points, we read the first few terms of the Rényi entropy, including the leading and next-leading contributions in the large central charge limit. Moreover, we show that the leading contribution is actually captured by the twist vacuum module. In this case by the Ward identity the four-point functions can be derived from the correlation function of four twist operators, which is related to double interval entanglement entropy. Holographically, we apply the recipe in 1303.7221 and 1306.4682 to compute the classical Rényi entropy and its 1-loop quantum correction, after imposing a new set of monodromy conditions. The holographic classical result matches exactly with the leading contribution in the field theory up to $e^{-4\pi TR}$ and $l^6$, while the holographic 1-loop contribution is in exact agreement with next-leading results in field theory up to $e^{-\frac{6\pi TR}{n}}$ and $l^4$ as well.
Fermilab | Newsroom | Press Releases | March 7, 2012: Tevatron...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
a Higgs boson between 147 and 179 GeV/c² at the 95-percent confidence level.
Coronal inflows during the interval 1996-2014
Sheeley, N. R. Jr.; Wang, Y.-M. [Space Science Division, Naval Research Laboratory, Washington DC 20375-5352 (United States)
2014-12-10
We extend our previous counts of coronal inflows from the 5 yr interval 1996-2001 to the 18 yr interval 1996-2014. By comparing stackplots of these counts with similar stackplots of the source-surface magnetic field and its longitudinal gradient, we find that the inflows occur in long-lived streams with counting rates in excess of 18 inflows per day at sector boundaries where the gradient exceeds 0.22 G rad⁻¹. These streams are responsible for the high (86%) correlation between the inflow rate and the longitudinal field gradient. The overall inflow rate was several times larger in sunspot cycle 23 than it has been so far in cycle 24, reflecting the relatively weak source-surface fields during this cycle. By comparison, in cycles 21-22, the source-surface field and its gradient had bursts of great strength, as if large numbers of inflows occurred during those cycles. We find no obvious relation between inflows and coronal mass ejections (CMEs) on timescales of days to weeks, regardless of the speeds of the CMEs, and only a 60% correlation on timescales of months, provided the CMEs are fast (V > 600 km s⁻¹). We conclude that most of the flux carried out by CMEs is returned to the Sun via field line reconnection well below the 2.0 R☉ inner limit of the LASCO field of view, and that the remainder accumulates in the outer corona for an eventual return at sector boundaries.
DIMACS Technical Report 2004-30 k-Interval-filament graphs
DIMACS Technical Report 2004-30, June 2004: k-Interval-filament graphs, by Fanica Gavril. In the plane, above a line L, construct for each interval i(v) ∈ I a filament v connecting its two endpoints, such that for every two filaments u, v having u ∩ v ≠ ∅ and disjoint intervals i(u)... filaments w with i
Ulrike Herzog
2009-02-28
We study an optimized measurement that discriminates two mixed quantum states with maximum confidence for each conclusive result, thereby keeping the overall probability of inconclusive results as small as possible. When the rank of the detection operators associated with the two different conclusive outcomes does not exceed unity we obtain a general solution. As an application, we consider the discrimination of two mixed qubit states. Moreover, for the case of higher-rank detection operators we give a solution for particular states. The relation of the optimized measurement to other discrimination schemes is also discussed.
Extending Sensor Calibration Intervals in Nuclear Power Plants
Coble, Jamie B.; Meyer, Ryan M.; Ramuhalli, Pradeep; Bond, Leonard J.; Shumaker, Brent; Hashemian, Hash
2012-11-15
Currently in the USA, sensor recalibration is required at every refueling outage, and it has emerged as a critical path item for shortening outage duration. International application of calibration monitoring, such as at the Sizewell B plant in the UK, has shown that sensors may operate within calibration tolerances for eight years or longer. Online monitoring can be employed to identify the sensors that actually require calibration, so that only those sensors are recalibrated. The US NRC accepted the general concept of online monitoring for sensor calibration monitoring in 2000, but no plants have been granted the necessary license amendment to apply it. This project addresses key issues in advanced recalibration methodologies and provides the science base to enable adoption of best practices for applying online monitoring, resulting in a public domain standardized methodology for sensor calibration interval extension. Research to develop this methodology will focus on three key areas: (1) quantification of uncertainty in modeling techniques used for calibration monitoring, with a particular focus on non-redundant sensor models; (2) accurate determination of acceptance criteria and quantification of the effect of acceptance criteria variability on system performance; and (3) the use of virtual sensor estimates to replace identified faulty sensors to extend operation to the next convenient maintenance opportunity.
Pattern Selection and Super-patterns in the Bounded Confidence Model
Ben-Naim, E
2015-01-01
We study pattern formation in the bounded confidence model of opinion dynamics. In this random process, opinion is quantified by a single variable. Two agents may interact and reach a fair compromise, but only if their difference of opinion falls below a fixed threshold. Starting from a uniform distribution of opinions with compact support, a traveling wave forms and it propagates from the domain boundary into the unstable uniform state. Consequently, the system reaches a steady state with isolated clusters that are separated by distance larger than the interaction range. These clusters form a quasi-periodic pattern where the sizes of the clusters and the separations between them are nearly constant. We obtain analytically the average separation between clusters L. Interestingly, there are also very small quasi-periodic modulations in the size of the clusters. The spatial periods of these modulations are a series of integers that follow from the continued fraction representation of the irrational average sepa...
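The interaction rule described above (a fair compromise only when the difference of opinion falls below a fixed threshold) can be sketched in a few lines. Agent count, threshold, and step count here are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

def bounded_confidence_step(opinions, threshold, rng):
    # Pick two distinct agents; they meet halfway only if their
    # difference of opinion falls below the fixed threshold.
    i, j = rng.choice(len(opinions), size=2, replace=False)
    if abs(opinions[i] - opinions[j]) < threshold:
        mid = 0.5 * (opinions[i] + opinions[j])
        opinions[i] = opinions[j] = mid

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=100)   # uniform initial opinions, compact support
x0_mean = x.mean()
for _ in range(50_000):
    bounded_confidence_step(x, threshold=1.0, rng=rng)
```

Note that a midpoint compromise conserves the total (and hence mean) opinion exactly, so the cluster pattern emerges purely from the interaction threshold, not from any drift.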
Earning public trust and confidence: Requisites for managing radioactive wastes. Final report
Not Available
1993-11-01
The Task Force on Radioactive Waste Management was created in April 1991 by former Secretary James D. Watkins, who asked the group to analyze the critical institutional question of how the Department of Energy (DOE) might strengthen public trust and confidence in the civilian radioactive waste management program. The panel met eight times over a period of 27 months and heard formal presentations from nearly 100 representatives of state and local governments, non-governmental organizations, and senior DOE Headquarters and Field Office managers. The group also commissioned a variety of studies from independent experts, contracted with the National Academy of Sciences and the National Academy of Public Administration to hold workshops on designing and leading trust-evoking organizations, and carried out one survey of parties affected by the Department's radioactive waste management activities and a second one of DOE employees and contractors.
Transits of planets with small intervals in circumbinary systems
Liu, Hui-Gen; Wang, Ying; Zhang, Hui; Zhou, Ji-Lin
2014-08-01
Transit times around single stars can be described well by a linear ephemeris. However, transit times in circumbinary systems are influenced both by the gravitational perturbations and the orbital phase variations of the central binary star. Adopting a coplanar analog of Kepler-16 as an example, we find that circumbinary planets can transit the same star more than once during a single planetary orbit, a phenomenon we call 'tight transits.' In certain geometric architectures, the projected orbital velocity of the planet and the secondary star can approach zero and change sign, resulting in very long transits and/or 2-3 transits during a single binary orbit. Whether tight transits are possible for a particular system depends primarily on the binary mass ratio and the orbital architecture of both the binary and the planet. We derive a time-dependent criterion to judge when tight transits are possible for any circumbinary system. These results are verified with full dynamical integrations that also reveal other tight transit characteristics, i.e., the transit durations and the intervals between tight transits. For the seven currently known circumbinary systems, we estimate these critical parameters both analytically and numerically. Due to the mutual inclination between the planet and the binary, tight transits can only occur across the less massive star B in Kepler-16, -34, -35, and -47 (for both planets). The long-term average frequency of tight transits (compared to typical transits) for Kepler-16, -34, and -35 are estimated to be several percent. Using full numerical integrations, the next tight transit for each system is predicted and the soonest example appears to be Kepler-47b and -47c, which are likely to have tight transits before 2025. These unique and valuable events often deserve special observational scrutiny.
Confidence building measures at sea:opportunities for India and Pakistan.
Vohra, Ravi Bhushan Rear Admiral; Ansari, Hasan Masood Rear Admiral
2003-12-01
The sea presents unique possibilities for implementing confidence building measures (CBMs) between India and Pakistan that are currently not available along the contentious land borders surrounding Jammu and Kashmir. This is due to the nature of maritime issues, the common military culture of naval forces, and a less contentious history of maritime interaction between the two nations. Maritime issues of mutual concern provide a strong foundation for more far-reaching future CBMs on land, while addressing pressing security, economic, and humanitarian needs at sea in the near-term. Although Indian and Pakistani maritime forces currently have stronger opportunities to cooperate with one another than their counterparts on land, reliable mechanisms to alleviate tension or promote operational coordination remain non-existent. Therefore, possible maritime CBMs, as well as pragmatic mechanisms to initiate and sustain cooperation, require serious examination. This report reflects the unique joint research undertaking of two retired Senior Naval Officers from both India and Pakistan, sponsored by the Cooperative Monitoring Center of the International Security Center at Sandia National Laboratories. Research focuses on technology as a valuable tool to facilitate confidence building between states having a low level of initial trust. Technical CBMs not only increase transparency, but also provide standardized, scientific means of interacting on politically difficult problems. Admirals Vohra and Ansari introduce technology as a mechanism to facilitate consistent forms of cooperation and initiate discussion in the maritime realm. 
They present technical CBMs capable of being acted upon as well as high-level political recommendations regarding the following issues: (1) Delimitation of the maritime boundary between India and Pakistan and its relationship to the Sir Creek dispute; (2) Restoration of full shipping links and the security of ports and cargos; (3) Fishing within disputed areas and resolution of issues relating to arrest and repatriation of fishermen from both sides; and (4) Naval and maritime agency interaction and possibilities for cooperation.
Haas, Zygmunt J.
scheme out-performs other backoff schemes, such as binary exponential backoff (BEB) and multiplicative increase. Index Terms: backoff algorithm, backoff interval, binary exponential backoff (BEB), multiplicative increase
On the need and use of models to explore the role of economic confidence:a survey.
Sprigg, James A.; Paez, Paul J.; Hand, Michael S.
2005-04-01
Empirical studies suggest that consumption is more sensitive to current income than suggested under the permanent income hypothesis, which raises questions regarding expectations for future income, risk aversion, and the role of economic confidence measures. This report surveys a body of fundamental economic literature as well as burgeoning computational modeling methods to support efforts to better anticipate cascading economic responses to terrorist threats and attacks. This is a three part survey to support the incorporation of models of economic confidence into agent-based microeconomic simulations. We first review broad underlying economic principles related to this topic. We then review the economic principle of confidence and related empirical studies. Finally, we provide a brief survey of efforts and publications related to agent-based economic simulation.
Developing information-space Confidence Building Measures (CBMs) between India and Pakistan
Yamin, Tughral
2014-06-01
The Internet has changed the world in ways hitherto unknown. The international financial system, air, land and maritime transport systems are all digitally linked. Similarly most militaries are fully or partially networked. This has not only sped up the decision making processes at all levels, it has also rendered these systems vulnerable to cyber-attacks. Cyber-warfare is now recognized as the most potent form of non-kinetic war fighting. In order to prevent large scale network-attacks, cyber-powers are simultaneously spending a lot of time, money and effort to erect redundant cyber-defenses and enhancing their offensive cyber capabilities. Difficulties in creating a stable environment in information-space stem from differing national perceptions regarding the freedom of the Internet, application of international law and problems associated with attribution. This paper discusses a range of Confidence Building Measures that can be created between India and Pakistan in information-space to control malicious cyber behavior and avert an inadvertent war.
Effects of perceptual load on startle reflex modification at a long lead interval
Effects of perceptual load on startle reflex modification at a long lead interval. GARY L. THORNE. Abstract: Inhibition of the startle eyeblink response at long lead intervals has been hypothesized to occur when the lead and startle stimuli are in different modalities under conditions of high perceptual load
Interval Methods for Sensitivity-Based Model-Predictive Control of
Kearfott, R. Baker
Interval Methods for Sensitivity-Based Model-Predictive Control of Solid Oxide Fuel Cell Systems ... and experiment for the thermal subprocess of a high-temperature solid oxide fuel cell system. Keywords: interval analysis, model-predictive control, sensitivity analysis, tracking control, solid oxide fuel cells
Regression Models with Interval Censoring Jian Huang and Jon A. Wellner 1
Wellner, Jon A.
Regression Models with Interval Censoring Jian Huang and Jon A. Wellner 1 University of Washington October 6, 1993 Abstract In this paper we discuss estimation in semiparametric regression models with interval censoring, with emphasis on estimation of the regression parameter . The first section surveys
LYAPUNOV AND SACKER-SELL SPECTRAL INTERVALS LUCA DIECI AND ERIK S. VAN VLECK
Van Vleck, Erik S.
LYAPUNOV AND SACKER-SELL SPECTRAL INTERVALS. LUCA DIECI AND ERIK S. VAN VLECK. Abstract: ...the Lyapunov spectral intervals. Since any bounded and continuous coefficient matrix function can be smoothly... continuous Lyapunov spectrum. Key words: exponential dichotomy, Sacker-Sell spectrum, Lyapunov exponents
Scaling and memory of intraday volatility return intervals in stock markets Fengzhong Wang,1
Stanley, H. Eugene
Scaling and memory of intraday volatility return intervals in stock markets Fengzhong Wang,1 Kazuko interval between price volatilities that are above a certain threshold q for 31 intraday data sets Yamasaki,1,2 Shlomo Havlin,1,3 and H. Eugene Stanley1 1 Center for Polymer Studies and Department
Sericola, Bruno
286 IEEE TRANSACTIONS ON COMPUTERS, VOL. 44, NO. 2, FEBRUARY 1995. Interval Availability Analysis. Gerardo Rubino and Bruno Sericola. Abstract: Interval availability is a dependability measure defined... availability level is high enough. The system is assumed to be modeled as a Markov process with countable state
Smith, Susan Marilyn Hartman
1977-01-01
...equivocal feedback on individuals' abilities, women do not have lower self-confidence than men. Feather and Simon (1971) found no sex differences in confidence of passing a subsequent anagrams test when the subjects had been given feedback in the form of..., 94) = 5.28, p < .01. A Scheffé comparison of means revealed that subjects who were given clear feedback were significantly more confident in same-sex competition (X = 4.26) and less confident in opposite-sex competition (X = 2.71), p < .05...
Integrated approach to predict confidence of GPS measurement. Massoud Sharif, A. Stein, Ernst M... Keywords: Acquisition, Transformation, GPS, Reference Data, Accuracy, Observations. ABSTRACT: Code measurement... on the quality of GPS measurements and its variability. The manuals of most GPS units provide a rough theoretical...
Statistical Models for Solar Flare Interval Distribution in Individual Active Regions
Yuki Kubo
2008-02-01
This article discusses statistical models for solar flare interval distribution in individual active regions. We analyzed solar flare data in 55 active regions that are listed in the GOES soft X-ray flare catalog. We discuss some problems with a conventional procedure to derive probability density functions from any data set and propose a new procedure, which uses the maximum likelihood method and Akaike Information Criterion (AIC) to objectively compare some competing probability density functions. We found that lognormal and inverse Gaussian models are more likely models than the exponential model for solar flare interval distribution in individual active regions. The results suggest that solar flares do not occur randomly in time; rather, solar flare intervals appear to be regulated by solar flare mechanisms. We briefly mention a probabilistic solar flare forecasting method as an application of a solar flare interval distribution analysis.
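The model-selection procedure the abstract describes (maximum-likelihood fits of competing distributions compared by AIC) can be sketched with SciPy. The synthetic waiting times below are an illustrative stand-in for the GOES flare-catalog intervals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative stand-in for flare waiting times (hours); real data would
# come from the GOES soft X-ray flare catalog.
intervals = rng.lognormal(mean=2.0, sigma=0.8, size=200)

candidates = {
    "exponential": stats.expon,
    "lognormal": stats.lognorm,
    "inverse Gaussian": stats.invgauss,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(intervals, floc=0)      # ML estimates, location fixed at 0
    loglik = np.sum(dist.logpdf(intervals, *params))
    k = len(params) - 1                       # free parameters (loc is fixed)
    aic[name] = 2 * k - 2 * loglik            # Akaike Information Criterion

best = min(aic, key=aic.get)                  # smallest AIC wins
```

A lower AIC indicates a more likely model after penalizing extra parameters; the exponential model corresponds to flares occurring randomly in time, so beating it supports the abstract's conclusion.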
Effinger, J.
2011-01-01
The use of whole building utility interval data for verifying energy savings from energy efficiency projects has become an attractive option as this data is increasingly available. Formal protocols, such as IPMVP Option C and ASHRAE Guideline 14...
Computing Best Possible Pseudo-Solutions to Interval Linear Systems of Equations
Kearfott, R. Baker
Panyukov, South Ural State University, Chelyabinsk, Russia, a_panyukov@mail.ru; Valentin A. Golodov, South Ural State University, Chelyabinsk, Russia, avaksa@gmail.com. Abstract: In the paper, we consider interval
Monitoring molecular interactions using photon arrival-time interval distribution analysis
Laurence, Ted A. (Livermore, CA); Weiss, Shimon (Los Angels, CA)
2009-10-06
A method for analyzing/monitoring the properties of species that are labeled with fluorophores. A detector is used to detect photons emitted from species that are labeled with one or more fluorophores and located in a confocal detection volume. The arrival time of each of the photons is determined. The interval of time between various photon pairs is then determined to provide photon pair intervals. The number of photons that have arrival times within the photon pair intervals is also determined. The photon pair intervals are then used in combination with the corresponding counts of intervening photons to analyze properties and interactions of the molecules including brightness, concentration, coincidence and transit time. The method can be used for analyzing single photon streams and multiple photon streams.
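The core bookkeeping described above can be sketched as follows; the function name and the `max_lag` cutoff are illustrative assumptions, not from the patent. For each pair of detected photons, we record the time interval between them together with the count of intervening photons.

```python
import numpy as np

def photon_pair_intervals(arrival_times, max_lag=3):
    """Return (interval, intervening_count) for every photon pair whose
    members are at most `max_lag` detection events apart."""
    t = np.sort(np.asarray(arrival_times, dtype=float))
    pairs = []
    for lag in range(1, max_lag + 1):
        # Pairs separated by `lag` positions have `lag - 1` photons between them.
        for dt in t[lag:] - t[:-lag]:
            pairs.append((dt, lag - 1))
    return pairs
```

Histogramming the intervals separately for each intervening-photon count gives the distributions from which properties such as brightness, concentration, coincidence, and transit time are extracted.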
A Recipe for Construction of the Critical Vertices for Left-Sector Stability of Interval
A Recipe for Construction of the Critical Vertices for Left-Sector Stability of Interval polynomials. This paper provides a recipe for construction of these critical vertices. Illustrative examples
Interval Computations as an Important Part of Granular Computing: An Introduction
Kreinovich, Vladik
interval techniques, and we list a few typical applications of these techniques. 2 Why Computations the best way to change this flow (e.g., by building dams or levees) is a problem of engineering
Robust stabilizer synthesis for interval plants using H-Infinity methods
Bhattacharya, Saikat
1993-01-01
OF SCIENCE, August 1993. Major Subject: Electrical Engineering. ROBUST STABILIZER SYNTHESIS FOR INTERVAL PLANTS USING H-INFINITY METHODS. A Thesis by SAIKAT BHATTACHARYA. Approved as to style and content by: S. P. Bhattacharyya (Chair of Committee), J. W... B. (Hons.), Indian Institute of Technology, Kharagpur, India. Chair of Advisory Committee: Dr. S. P. Bhattacharyya. The aim of this research has been to develop a synthesis method for the robust stabilization of interval plants. First, the biggest...
Vannoni, M.; Duggan, R. [Sandia National Labs., Albuquerque, NM (United States). Cooperative Monitoring Center; Nam, M.K.; Moon, K.K.; Kim, M.J. [Korea Inst. for Defense Analyses, Seoul (Korea, Republic of). Arms Control Research Center
1997-04-01
Confidence building measures (CBMs), particularly military ones, that address the security needs of North and South Korea could decrease the risk of conflict on the Korean Peninsula and help create an environment in which to negotiate a peace regime. The Korea Institute for Defense Analyses (KIDA) and the Cooperative Monitoring Center (CMC) of Sandia National Laboratories collaborated to identify potential CBMs and define associated monitoring. The project is a conceptual analysis of political and technical options for confidence building that might be feasible in Korea at some future time. KIDA first analyzed current security conditions and options for CBMs. Their conclusions are presented as a hypothetical agreement to strengthen the Armistice Agreement by establishing Limited Force Deployment Zones along the Military Demarcation Line. The goal of the hypothetical agreement is to increase mutual security and build confidence. The CMC then used KIDA's scenario to develop a strategy for cooperatively monitoring the agreement. Cooperative monitoring is the collecting, analyzing and sharing of agreed information among parties to an agreement and typically relies on the use of commercially available technology. A cooperative monitoring regime must be consistent with the agreement's terms; the geographic, logistic, military, and political factors in the Korean setting; and the capabilities of monitoring technologies. This report describes the security situation on the Korean peninsula, relevant precedents from other regions, the hypothetical agreement for reducing military tensions, a monitoring strategy for the hypothetical Korean agreement, examples of implementation, and a description of applicable monitoring technologies and procedures.
M. Kumar; S. Sahoo
2012-07-02
The time interval between the incident and scattered photon in the Raman effect, and between photon absorption and electron emission in the photoelectric effect, has not yet been measured, because no instrument discovered so far can resolve intervals this short. It can, however, be calculated theoretically from a basic principle of physics: impulse equals the change in momentum. Treating the collision between electron and photon as perfectly inelastic in the photoelectric effect, as elastic or inelastic in the Raman effect, and as elastic in plane-mirror reflection, and treating the electron-photon interaction as a strong gravitational interaction, we calculate the required time interval. During these phenomena there is lattice vibration, which can be quantized as phonons.
Distribution of Primes and of Interval Prime Pairs Based on $\Theta$ Function
Yifang Fan; Zhiyu Li
2010-04-19
The $\Theta$ function is defined based upon the Kronecker symbol. In light of the principle of inclusion-exclusion, the $\Theta$ function of the sine function is used to denote the distribution of composites and primes. The structure of the Goldbach Conjecture is analyzed, and the $\Xi$ function is brought forward via a linear Diophantine equation; by relating it to the $\Theta$ function, the interval distribution of composite pairs and prime pairs (i.e., the Goldbach Conjecture) is thus obtained. In the end, Abel's Theorem (multiplication of series) is used to discuss the lower limit of the distribution of the interval prime pairs.
Interval Methods in Remote Sensing: Reliable SubDivision of Geological Areas
Ward, Karen
Interval Methods in Remote Sensing: Reliable SubDivision of Geological Areas David D. Coblentz, G. The subdivision of a geological zone into segments is often a controversial issue, with different evidence of the geological subdivision is the fact that the existing subdivision is often based on the chemical and physical
Estimation of shear-wave interval attenuation from mode-converted data Bharath Shekar1
Tsvankin, Ilya
Tsvankin. ABSTRACT: Interval attenuation measurements provide valuable information for reservoir characterization and lithology discrimination. We extend the attenuation layer-stripping method of Behura ... of the material (Prasad and Nur, 2003), the presence of aligned fluid-filled fractures (Chapman, 2003; Batzle et
Menzel, Randolf - Institut für Biologie
2007-01-01
Stationary spiking of single neurons is often modelled by a renewal point process. Here, we tested renewal processes, which are frequently used as models for neuronal spiking. Renewal processes are a simple and well-studied class ... (i.i.d.) according to a fixed interval distribution [10]. Renewal models may be defined in abstract
Ungar, Lyle H.
Estimating Prediction Intervals for Artificial Neural Networks. Lyle H. Ungar, Richard D. De Veaux. ...prediction limits for ANNs: a frequentist approach, based on standard non-linear regression theory for estimating the prediction uncertainties of non-linear regressions (see, e.g., Seber and Wild, 1989), based on local
Ungar, Lyle H.
to obtaining prediction limits for ANNs: a frequentist approach, based on standard nonlinear regression ... of the prediction intervals, their computational costs and practical implementation issues of the two approaches ... of as doing nonlinear regression. Standard methods exist for estimating the prediction uncertainties of non
Barranco, Bernabe Linares
2007-01-01
Neurocomputing 70 (2007) 2692-2700. Inter-spike-interval analysis of AER Poisson-like generator. Sevilla, Spain. Available online 10 May 2007. Abstract: Address-Event-Representation (AER) is a communication... In developing AER-based systems it is very convenient to have available some means of generating AER streams
Lummaa, Virpi
by a shorter waiting time to the first birth (first birth interval, FBI), are able to afford higher costs was divided into tertiles based on the length of FBI. Results: Women with the shortest FBI had a higher number.06). Women who had ever given birth to twins had shorter FBI than women of singletons (20.1 and 26.1 months
Uncertainty in Risk Analysis: Towards a General Second-Order Approach Combining Interval,
Kreinovich, Vladik
Uncertainty in Risk Analysis: Towards a General Second-Order Approach Combining Interval ... important in risk analysis. A natural way to describe this uncertainty is to describe a set of possible ... methods of handling such partial information in risk analysis. Several such techniques have been presented
Proper Oil Sampling Intervals and Sample Collection Techniques Gasoline/Diesel/Natural Gas Engines
Proper Oil Sampling Intervals and Sample Collection Techniques, Gasoline/Diesel/Natural Gas Engines: · Oil samples can be collected during oil changes. Follow manufacturers' recommendations on the frequency (hours, mileage, etc.) of oil changes. · Capture a sample from the draining oil while the oil is still hot
Weak ε-nets and interval chains. Noga Alon, Haim Kaplan, Gabriel Nivasch
Shamir, Ron
Weak ε-nets and interval chains. Noga Alon, Haim Kaplan, Gabriel Nivasch, Micha Sharir, Shakhar... (noga@post.tau.ac.il, gnivasch@post.tau.ac.il, michas@post.tau.ac.il, shakhar@courant.nyu.edu). MINERVA Center for Geometry at Tel Aviv University. Work by Haim Kaplan was partially supported by ISF Grant 975
Statistical properties of heartbeat intervals during atrial fibrillation Wanzhen Zeng and Leon Glass
Glass, Leon
Statistical properties of heartbeat intervals during atrial fibrillation. Wanzhen Zeng and Leon Glass. ...node which provides an electrical pathway between the atria and the main pumping chambers of the heart ... in cardiology that documents the statistical properties of the ventricular activity during atrial
TSAR: A Two Tier Sensor Storage Architecture Using Interval Skip Graphs
Ganesan, Deepak
TSAR: A Two Tier Sensor Storage Architecture Using Interval Skip Graphs. Peter Desnoyers, Deepak... (pjd@cs.umass.edu, dganesan@cs.umass.edu, shenoy@cs.umass.edu). ABSTRACT: Archival storage of sensor data ... metadata by employing local archiving at the sensors and distributed indexing at the proxies. At the proxy
Stress evolution of the San Andreas fault system: Recurrence interval versus locking depth
Smith-Konter, Bridget
Stress evolution of the San Andreas fault system: Recurrence interval versus locking depth Bridget by stress that has accumulated in the upper locked portion of the crust. The present-day stress accumulation rate on any given fault segment is fairly well resolved by current geodetic measurements. Model stress
LYAPUNOV SPECTRAL INTERVALS: THEORY AND COMPUTATION LUCA DIECI y AND ERIK S. VAN VLECK z
Van Vleck, Erik S.
LYAPUNOV SPECTRAL INTERVALS: THEORY AND COMPUTATION. LUCA DIECI AND ERIK S. VAN VLECK. ...dichotomy of Sacker and Sell and the spectrum defined in terms of upper and lower Lyapunov exponents ... information. Finally, we discuss the algorithms we have used to approximate the Lyapunov and Sacker
Huang, Yinlun
Sustainable distributed biodiesel manufacturing under uncertainty: an interval... A sophisticated biodiesel manufacturing study demonstrated methodological efficacy. Keywords: Simulation, Uncertainty. Biodiesel, a clean-burning alternative fuel, can be produced using
Two intervals Rényi entanglement entropy of compact free boson on torus
Liu, Feihu
2015-01-01
We compute the $N=2$ Rényi entanglement entropy of two intervals at equal time in a circle, for the theory of a 2d compact complex free scalar at finite temperature. This is carried out by performing functional integral on a genus 3 ramified cover of the torus, wherein the quantum part of the integral is captured by the four point function of twist fields on the worldsheet torus, and the classical piece is given by summing over winding modes of the genus 3 surface onto the target space torus. The final result is given in terms of a product of theta function and certain multi-dimensional theta function. We demonstrate the T-duality invariance of the result. We also study its low temperature limit. In the case in which the size of the intervals and of their separation are much smaller than the whole system, our result is in exact agreement with the known result for two intervals on an infinite system at zero temperature [eeoftwo]. In the case in which the separation between the two intervals is much smal...
Haney, Elizabeth anne
1998-01-01
... approaching fatigue on days 2 and 5. On interval work days, 'on-board' heart rate monitors were used to record heart rates through a series of anaerobic maneuver repetitions with 1 min of recovery between repetitions...
A Genealogy for Finite Kneading Sequences of Bimodal Maps on the Interval
John Ringland; Charles Tresser
1993-07-20
We generate all the finite kneading sequences of one of the two kinds of bimodal map on the interval, building each sequence uniquely from a pair of shorter ones. There is a single pair at generation 0, with members of length 1. Concomitant with this genealogy of kneading sequences is a unified genealogy of all the periodic orbits. (6/93)
VARIABILITY OF SOLAR RADIATION DATA OVER SHORT TIME INTERVALS Frank Vignola
Oregon, University of
Department of Physics ... This article examines the variability of beam and global solar radiation over short ... solar radiation values with ground-based data. 1. INTRODUCTION. It is difficult to evaluate solar ...
Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.
2013-04-27
This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating the • number of samples required to achieve a specified confidence in characterization and clearance decisions • confidence in making characterization and clearance decisions for a specified number of samples for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, which is commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1.
qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0 2. qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0 3. quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0 4. quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0. For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently only exist for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods that are applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports the development of such extensions. For quantitative data, this report also presents statistical methods and formulas for 1. quantifying the uncertainty in measured sample results 2. estimating the true surface concentration corresponding to a surface sample 3. quantifying the uncertainty of the estimate of the true surface concentration. All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
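The X%/Y% clearance statement above has the same shape as the classic acceptance-sampling rule for all-negative results. Below is a minimal sketch under simplifying assumptions (simple random sampling only, FNR = 0); it is not the report's exact CJR formula, which combines judgment and random samples:

```python
import math

def n_random_samples(confidence: float, clean_fraction: float) -> int:
    """Smallest n such that, if all n randomly placed samples are
    negative, we may state with `confidence` that at least
    `clean_fraction` of the decision area is free of detectable
    contamination.  Standard compliance-sampling rule; assumes
    FNR = 0 and purely random sampling."""
    return math.ceil(math.log(1.0 - confidence) / math.log(clean_fraction))

print(n_random_samples(0.95, 0.99))
```

For example, 299 all-negative random samples support a statement of 95% confidence that at least 99% of the area is free of detectable contamination.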
Reader, Simon
... volumes. Innovation frequencies also correlated with laboratory measures of learning, increasing our confidence in the innovation measure, and with social learning frequencies, suggesting that innovation and social learning propensities have evolved together. Species range size did not correlate ...
Kim, Su Yeon
2011-08-08
This study explored beginning and advanced pre-service teachers’ Internet use and their experience, confidence, and competence in using new literacies of the Internet. In addition, this study compared the pre-service ...
Interval Data Analysis with the Energy Charting and Metrics Tool (ECAM)
Taasevigen, Danny J.; Katipamula, Srinivas; Koran, William
2011-07-07
Analyzing whole building interval data is an inexpensive but effective way to identify and improve building operations, and ultimately save money. Utilizing the Energy Charting and Metrics Tool (ECAM) add-in for Microsoft Excel, building operators and managers can begin implementing changes to their Building Automation System (BAS) after trending the interval data. The two data components needed for full analyses are whole building electricity consumption (kW or kWh) and outdoor air temperature (OAT). Using these two pieces of information, a series of plots and charts can be created in ECAM to monitor the building's performance over time, gain knowledge of how the building is operating, and make adjustments to the BAS to improve efficiency and start saving money.
...-up is the murderer from having seen the crime, and then learns of the substantial experimental psychology evidence on confidence in a specific case of the same kind of question. What if you're very confident that the murderer ...
CP(N-1) model on finite interval in the large N limit
A. Milekhin
2012-07-02
The CP(N-1) sigma model on a finite interval of length R with Dirichlet boundary conditions is analysed in the 1/N expansion. The theory has two phases, separated by a phase transition at R ~ 1/\Lambda, where \Lambda is the dynamical scale of the CP(N-1) model. The dependence of the vacuum energy on R, and especially the Casimir-type scaling 1/R, is discussed.
Pan Danguang; Gao Yanhua; Song Junlei [School of Civil and Environmental Engineering, University of Science and Technology Beijing, Beijing, 100083 (China)
2010-05-21
A new analysis technique, called the multi-level interval estimation method, is developed for locating damage in structures. In this method, artificial neural network (ANN) analysis is combined with statistical theory to estimate the range of the damage location. The ANN is a multilayer perceptron trained by back-propagation. Natural frequencies and the modal shape at a few selected points are used as input to identify the location and severity of damage. For large-scale structures with many elements, the multi-level interval estimation method reduces the estimated range of the damage location step by step. At each step, an estimated range of the damage location is obtained from the output of the ANN using interval estimation. The training cases for the next ANN are selected from this range after a linear transform, and the output of the new ANN yields a further reduced range. Two numerical example analyses, on a 10-bar truss and a 100-bar truss, are presented to demonstrate the effectiveness of the proposed method.
Byron P. Roe; Michael B. Woodroofe
2000-10-13
We propose using a Bayes procedure with uniform improper prior to determine credible belts for the mean of a Poisson distribution in the presence of background and for the continuous problem of measuring a non-negative quantity $\\theta$ with a normally distributed measurement error. Within the Bayesian framework, these belts are optimal. The credible limits are then examined from a frequentist point of view and found to have good frequentist and conditional frequentist properties.
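The flat-prior Poisson construction can be sketched numerically: with a uniform improper prior on the signal mean s ≥ 0 and known background b, the posterior after observing n counts is proportional to exp(-s)(s+b)^n, and a credible upper limit is read off the posterior CDF. This illustrates only the posterior the authors start from, not their optimal-belt construction:

```python
import numpy as np

def upper_credible_limit(n_obs, background, cl=0.9, s_max=50.0, grid=200001):
    """Bayesian upper limit for a Poisson signal mean s with known
    background, using a uniform improper prior on s >= 0.
    Posterior density is proportional to exp(-s) * (s + background)**n_obs."""
    s = np.linspace(0.0, s_max, grid)
    post = np.exp(-s) * (s + background) ** n_obs  # unnormalized posterior
    cdf = np.cumsum(post)
    cdf /= cdf[-1]                                 # normalize numerically
    return s[np.searchsorted(cdf, cl)]

# zero observed counts, no background: the 90% limit should be -ln(0.1)
print(round(upper_credible_limit(0, 0.0), 2))
```

Increasing the assumed background with fixed counts pulls the limit down, as expected from the (s + b) dependence.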
Ramanathan, Nithya
2008-01-01
... the arsenic problem in Bangladesh (doctoral thesis) ... deployments undertaken in Bangladesh in January 2006 ... in a rice paddy in Bangladesh to help scientists evaluate ...
Short- and long-lead-interval modification of the acoustic startle eyeblink response: comparing auditory ... that modification of startle by lead stimuli with short and long lead intervals is modulated by stimulus significance. The significant stimulus in a tone-duration judgement task generates enhanced short-lead ...
Kobourov, Stephen G.
Weak Unit Disk and Interval Representation of Graphs. M. J. Alam, S. G. Kobourov, S. Pupyrev, and J. ... We study intersection representations with unit balls: unit disks in the plane and unit intervals on the line. Given a graph, the goal is to represent its vertices by unit-size balls so that the balls for two adjacent vertices intersect.
Nam, Man-Kwon; Shin, Sung-Tack
1999-06-01
Nuclear energy continues to be a strong and growing component of economic development in Northeast Asia. A broad range of nuclear energy systems already exists across the region and vigorous growth is projected. Associated with these capabilities and plans are various concerns about operational safety, environmental protection, and accumulation of spent fuel and other nuclear materials. We consider cooperative measures that might address these concerns. The confidence building measures suggested here center on the sharing of information to lessen concerns about nuclear activities or to solve technical problems. These activities are encompassed by an Enhanced Nuclear Transparency in Northeast Asia (ENTNEA) concept that would be composed of near-term, information-sharing activities and an eventual regional institution. The near-term activities would address specific concerns and build a tradition of cooperation; examples include radiation measurements for public safety and emergency response, demonstration of safe operations at facilities and in transportation, and material security in the back end of the fuel cycle. Linkages to existing efforts and organizations would be sought to maximize the benefits of cooperation. In the longer term, the new cooperative tradition might evolve into an ENTNEA institution. In institutional form, ENTNEA could combine the near-term activities and new cooperative activities, which might require an institutional basis, for the mutual benefit and security of regional parties.
Panek, Petr; Prochazka, Ivan [Institute of Photonics and Electronics, Academy of Sciences of the Czech Republic, Chaberska 57, 182 51 Prague (Czech Republic)
2007-09-15
This article deals with a time interval measurement device based on a surface acoustic wave (SAW) filter as a time interpolator. The operating principle is based on the fact that a transversal SAW filter excited by a short pulse generates a finite signal with highly suppressed spectra outside a narrow frequency band. If the responses to two excitations are sampled at clock ticks, they can be precisely reconstructed from a finite number of samples and then compared so as to determine the time interval between the two excitations. We have designed and constructed a two-channel time interval measurement device which allows independent timing of two events and evaluation of the time interval between them. The device has been constructed using commercially available components. The experimental results proved the concept. We have assessed a single-shot time interval measurement precision of 1.3 ps rms, corresponding to a time-of-arrival precision of 0.9 ps rms in each channel. The temperature drift of the measured time interval is lower than 0.5 ps/K, and the long-term stability is better than ±0.2 ps/h. These are, to our knowledge, the best values reported for a time interval measurement device. The results are in good agreement with the error budget based on the theoretical analysis.
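The reconstruct-and-compare step can be illustrated with a generic sub-sample delay estimator: cross-correlate the two sampled responses and interpolate the correlation peak. This is only a sketch of the interpolation idea (Gaussian test pulses, parabolic peak fit), far cruder than the band-limited reconstruction the device performs:

```python
import numpy as np

def subsample_delay(a, b, fs):
    """Estimate the delay of pulse b relative to pulse a with
    sub-sample resolution, via parabolic interpolation of the
    cross-correlation peak (illustrative only)."""
    xc = np.correlate(b, a, mode="full")
    k = int(np.argmax(xc))                          # integer-lag peak
    y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)   # parabola vertex offset
    return (k - (len(a) - 1) + frac) / fs

# two Gaussian pulses sampled at fs = 1, the second delayed by 7.3 samples
t = np.arange(200.0)
a = np.exp(-((t - 80.0) / 6.0) ** 2)
b = np.exp(-((t - 87.3) / 6.0) ** 2)
print(round(subsample_delay(a, b, 1.0), 1))  # close to 7.3
```

The picosecond numbers quoted in the abstract come from a much better-conditioned interpolation on the narrowband SAW response, but the principle, recovering a delay far below the sampling step, is the same.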
The Entanglement Renyi Entropies of Disjoint Intervals in AdS/CFT
Thomas Faulkner
2013-03-28
We study entanglement Renyi entropies (EREs) of 1+1 dimensional CFTs with classical gravity duals. Using the replica trick the EREs can be related to a partition function of n copies of the CFT glued together in a particular way along the intervals. In the case of two intervals this procedure defines a genus n-1 surface and our goal is to find smooth three dimensional gravitational solutions with this surface living at the boundary. We find two families of handlebody solutions labelled by the replica index n. These particular bulk solutions are distinguished by the fact that they do not spontaneously break the replica symmetries of the boundary surface. We show that the regularized classical action of these solutions is given in terms of a simple numerical prescription. If we assume that they give the dominant contribution to the gravity partition function we can relate this classical action to the EREs at leading order in G_N. We argue that the prescription can be formulated for non-integer n. Upon taking the limit n -> 1 the classical action reproduces the predictions of the Ryu-Takayanagi formula for the entanglement entropy.
Scaling and memory of intraday volatility return intervals in stock market
Wang, F; Stanley, H E; Yamasaki, K; Havlin, Shlomo; Wang, Fengzhong; Yamasaki, Kazuko
2006-01-01
We study the return interval $\\tau$ between price volatilities that are above a certain threshold $q$ for 31 intraday datasets, including the Standard & Poor's 500 index and the 30 stocks that form the Dow Jones Industrial index. For different threshold $q$, the probability density function $P_q(\\tau)$ scales with the mean interval $\\bar{\\tau}$ as $P_q(\\tau)={\\bar{\\tau}}^{-1}f(\\tau/\\bar{\\tau})$, similar to that found in daily volatilities. Since the intraday records have significantly more data points compared to the daily records, we could probe for much higher thresholds $q$ and still obtain good statistics. We find that the scaling function $f(x)$ is consistent for all 31 intraday datasets in various time resolutions, and the function is well approximated by the stretched exponential, $f(x)\\sim e^{-a x^\\gamma}$, with $\\gamma=0.38\\pm 0.05$ and $a=3.9\\pm 0.5$, which indicates the existence of correlations. We analyze the conditional probability distribution $P_q(\\tau|\\tau_0)$ for $\\tau$ following a certa...
Neep, Michael J; Steffens, Tom; Owen, Rebecca; McPhail, Steven M
2014-06-15
The provision of a written comment on traumatic abnormalities of the musculoskeletal system detected by radiographers can assist referrers and may improve patient management, but the practice has not been widely adopted outside the United Kingdom. The purpose of this study was to investigate Australian radiographers' perceptions of their readiness for practice in a radiographer commenting system and their educational preferences in relation to two different delivery formats of image interpretation education, intensive and non-intensive. A cross-sectional web-based questionnaire was implemented between August and September 2012. Participants included radiographers with experience working in emergency settings at four Australian metropolitan hospitals. Conventional descriptive statistics, frequency histograms, and thematic analysis were undertaken. A Wilcoxon signed-rank test examined whether a difference in preference ratings between intensive and non-intensive education delivery was evident. The questionnaire was completed by 73 radiographers (68% response rate). Radiographers reported higher confidence and self-perceived accuracy to detect traumatic abnormalities than to describe traumatic abnormalities of the musculoskeletal system. Radiographers frequently reported high desirability ratings for both the intensive and the non-intensive education delivery; no difference in desirability ratings between these two formats was evident (z = 1.66, P = 0.11). Some Australian radiographers perceive they are not ready to practise in a frontline radiographer commenting system. Overall, radiographers indicated mixed preferences for image interpretation education delivered via intensive and non-intensive formats. Further research, preferably randomised trials, investigating the effectiveness of intensive and non-intensive formats of image interpretation education for radiographers is warranted.
Beebe, Sammy Denzil
1975-01-01
QUALITY CHARACTERISTICS OF VACUUM-PACKAGED BEEF AS AFFECTED BY POSTMORTEM CHILL, STORAGE TEMPERATURE AND STORAGE INTERVAL. A Thesis by SAMMY DENZIL BEEBE, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, December 1975. Major Subject: Animal Science.
Chacron, Maurice
Integrate-and-fire neurons with threshold noise: A tractable model of how interspike interval ... transmission properties. For this purpose, we employ two simple firing models, one of which generates a renewal ... exclusively at high frequencies, the renewal model can transfer more information than the nonrenewal model.
Vishal Midya
2015-02-06
In this work, exact mathematical functions have been formulated for three important theoretical Shruti (micro-tonal interval) distributions in Hindustani music, i.e. for the Western Compilation, Deval, and Nagoji Row distributions. A generalized mathematical function for Shrutis has also been formulated. This generalized function conforms much more closely to the experimentally derived Shruti distribution than the theoretical distributions do.
Interval Set Clustering of Web Users using Modified Kohonen Self-Organizing Maps ... Prague, Czech Republic. Web usage mining involves the application of data mining techniques to discover usage patterns from web data. Clustering is one of the important functions in web ...
Dan Nelson; Joseph Hardin; Iosif Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
1990-01-01
W-Band Scanning ARM Cloud Radar (W-SACR) Hemispherical Sky RHI Scans (6 horizon-to-horizon scans at 30-degree azimuth intervals)
Dan Nelson; Joseph Hardin; Iosif Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
2011-09-14
X-Band Scanning ARM Cloud Radar (XSACR) Hemispherical Sky RHI Scans (6 horizon-to-horizon scans at 30-degree azimuth intervals)
Dan Nelson; Joseph Hardin; Iosif Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
2011-05-24
Ka-Band Scanning ARM Cloud Radar (KASACR) Hemispherical Sky RHI Scan (6 horizon-to-horizon scans at 30-degree azimuth intervals)
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Dan Nelson; Joseph Hardin; Iosif (Andrei) Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
X-Band Scanning ARM Cloud Radar (XSACR) Hemispherical Sky RHI Scans (6 horizon-to-horizon scans at 30-degree azimuth intervals)
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Dan Nelson; Joseph Hardin; Iosif (Andrei) Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
W-Band Scanning ARM Cloud Radar (W-SACR) Hemispherical Sky RHI Scans (6 horizon-to-horizon scans at 30-degree azimuth intervals)
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Dan Nelson; Joseph Hardin; Iosif (Andrei) Lindenmaier; Bradley Isom; Karen Johnson; Nitin Bharadwaj
Ka-Band Scanning ARM Cloud Radar (KASACR) Hemispherical Sky RHI Scan (6 horizon-to-horizon scans at 30-degree azimuth intervals)
Tolleson, Douglas Ray
1986-01-01
Interrelationship of Endogenous and Exogenous Prostaglandins with Uterine Involution and Postpartum Interval in Beef Cows and Heifers (August 1986). Douglas Ray Tolleson, B.S., Texas A&M University. Chairman of Advisory Committee: Dr. Ronald D. Randel. A review ... ALPHA PRODUCTION BY THE INVOLUTING BOVINE UTERUS AT 14 AND 35 DAYS POSTPARTUM: PATTERN OF RELEASE AND RESPONSE TO PHYSICAL MANIPULATION. Introduction. Materials and Methods. Results. Discussion. CHAPTER V ...
Pawloski, G A; Wurtz, J; Drellack, S L
2009-12-29
Pahute Mesa at the Nevada Test Site contains about 8.0E+07 curies of radioactivity caused by underground nuclear testing. The Underground Test Area Subproject has entered Phase II of data acquisition, analysis, and modeling to determine the risk to receptors from radioactivity in the groundwater, establish a groundwater monitoring network, and provide regulatory closure. Evaluation of radionuclide contamination at Pahute Mesa is particularly difficult due to the complex stratigraphy and structure caused by multiple calderas in the Southwestern Nevada Volcanic Field and overprinting of Basin and Range faulting. Included in overall Phase II goals is the need to reduce the uncertainty and improve confidence in modeling results. New characterization efforts are underway, and results from the first year of a three-year well drilling plan are presented.
Faber, V.
1994-11-29
Livelock-free message routing is provided in a network of interconnected nodes that is flushable in time T. An input message processor generates sequences of at least N time intervals, each of duration T. An input register provides for receiving and holding each input message, where the message is assigned a priority state p during an nth one of the N time intervals. At each of the network nodes a message processor reads the assigned priority state and awards priority to messages with priority state (p-1) during an nth time interval and to messages with priority state p during an (n+1)th time interval. The messages that are awarded priority are output on an output path toward the addressed output message processor. Thus, no message remains in the network for a time longer than T. 4 figures.
Gandhi, Rajiv C.
Sub-coloring and Hypo-coloring Interval Graphs. Rajiv Gandhi, Bradford Greening, Jr., Sriram ... (University of Iowa, Iowa 52242; Max-Planck Institute for Informatik, Saarbrücken, Germany)
M. P. Freeman; N. W. Watkins; D. J. Riley
2000-06-28
We calculate for the first time the probability density functions (PDFs) P of burst energy e, duration T and inter-burst interval tau for a known turbulent system in nature. Bursts in the earth-sun component of the Poynting flux at 1 AU in the solar wind were measured using the MFI and SWE experiments on the NASA WIND spacecraft. We find P(e) and P(T) to be power laws, consistent with self-organised criticality (SOC). We find also a power law form for P(tau) that distinguishes this turbulent cascade from the exponential P(tau) of ideal SOC, but not from some other SOC-like sandpile models. We discuss the implications for the relation between SOC and turbulence.
Indoor Thermal Factors and Symptoms in Office Workers: Findings from the U.S. EPA BASE Study
Mendell, Mark; Mirer, Anna
2008-06-01
Some prior research in office buildings has associated higher indoor temperatures, even within the recommended thermal comfort range, with increased worker symptoms. We reexamined this relationship in data from 95 office buildings in the U.S. Environmental Protection Agency's Building Assessment Survey and Evaluation (BASE) Study. We investigated relationships between building-related symptoms and thermal metrics constructed from real-time measurements. We estimated odds ratios (ORs) and 95% confidence intervals in adjusted logistic regression models with generalized estimating equations, overall and by season. Winter indoor temperatures spanned the recommended winter comfort range; summer temperatures were mostly colder than the recommended summer range. Increasing indoor temperatures, overall, were associated with increases in few symptoms. Higher winter indoor temperatures, however, were associated with increases in all symptoms analyzed. Higher summer temperatures, above 23°C, were associated with decreases in most symptoms. Humidity ratio, a metric of absolute humidity, showed few clear associations. Thus, increased symptoms with higher temperatures within the thermal comfort range were found only in winter. In summer, buildings were overcooled, and only the higher observed temperatures were within the comfort range; these were associated with decreased symptoms. Confirmation of these findings would suggest that thermal management guidelines consider health effects as well as comfort.
The Dark Matter Halos of Massive, Relaxed Galaxy Clusters Observed With Chandra
Schmidt, Robert W. (Heidelberg, Astron. Rechen Inst.); Allen, S. W. (KIPAC, Menlo Park)
2006-10-11
We use the Chandra X-ray Observatory to study the dark matter halos of 34 massive, dynamically relaxed galaxy clusters, spanning the redshift range 0.06 < z < 0.7. The observed dark matter and total mass (dark-plus-luminous matter) profiles can be approximated by the Navarro, Frenk & White (hereafter NFW) model for cold dark matter (CDM) halos; for ~80 percent of the clusters, the NFW model provides a statistically acceptable fit. In contrast, the singular isothermal sphere model can, in almost every case, be completely ruled out. We observe a well-defined mass-concentration relation for the clusters, with a normalization and intrinsic scatter in good agreement with the predictions from simulations. The slope of the mass-concentration relation, c ∝ M_vir^a / (1 + z)^b with a = -0.41 ± 0.11 at 95 percent confidence, is steeper than the value a ~ -0.1 predicted by CDM simulations for lower mass halos. With the slope a included as a free fit parameter, the redshift evolution of the concentration parameter, b = 0.54 ± 0.47 at 95 percent confidence, is also slower than, but marginally consistent with, the same simulations (b ~ 1). Fixing a ~ -0.1 leads to an apparent evolution that is significantly slower, b = 0.20 ± 0.45, although the goodness of fit in this case is significantly worse. Using a generalized NFW model, we find the inner dark matter density slope, α, to be consistent with unity at 95 percent confidence for the majority of clusters. Combining the results for all clusters for which the generalized NFW model provides a good description of the data, we measure α = 0.88 ± 0.29 at 95 percent confidence, in agreement with CDM model predictions.
An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Chan, Anthony; Gropp, William; Lusk, Ewing
2008-01-01
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the trace files at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
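The access pattern described, cost roughly proportional to the number of events inside the queried window, can be imitated in memory with a flat sorted index and binary search. The paper's format is hierarchical and file-backed, so the class below (names invented) is only a toy stand-in for the idea:

```python
from bisect import bisect_left, bisect_right

class TraceIndex:
    """Sorted index over time-stamped events: a window query costs
    O(log N + k), where k is the number of events in the window.
    A much-simplified, in-memory stand-in for a hierarchical
    trace-file index."""

    def __init__(self, events):
        # events: iterable of (timestamp, payload) pairs
        self.events = sorted(events)
        self.times = [t for t, _ in self.events]

    def window(self, t0, t1):
        """All events with t0 <= timestamp <= t1."""
        lo = bisect_left(self.times, t0)
        hi = bisect_right(self.times, t1)
        return self.events[lo:hi]

idx = TraceIndex([(5, "recv"), (1, "send"), (9, "barrier"), (3, "compute")])
print(idx.window(2, 6))  # -> [(3, 'compute'), (5, 'recv')]
```

A real trace-file format additionally has to keep long-lived states that span the window boundary visible, which is part of what the hierarchical layers in the paper provide.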
Buyyounouski, Mark K.; Hanlon, Alexandra L.; Horwitz, Eric M.; Pollack, Alan (Department of Radiation Oncology, Fox Chase Cancer Center, Philadelphia, PA, United States)
2008-01-01
Purpose: Few biochemical parameters have been related to mortality. The present study examined the clinical utility of the interval to biochemical failure (IBF) as a prognostic factor for distant metastasis (DM) and prostate cancer-specific mortality (PCSM) after radiotherapy. Methods and Materials: The study group consisted of 211 T1c-T3Nx-N0M0 patients who had experienced BF among 1,174 men treated with three-dimensional conformal radiotherapy alone. Biochemical failure was defined as a post-treatment prostate-specific antigen (PSA) level at, or greater than, the PSA nadir plus 2 ng/mL. Cox proportional hazards modeling was used to identify independent predictors of DM and PCSM on multivariate analysis. Results: An IBF of <18 months was independently predictive for DM (p = 0.008), as was a Gleason score of 7-10 (p = 0.0005), PSA nadir ≥2 ng/mL (p = 0.04), and decreasing radiation dose (p = 0.02) on multivariate analysis, including increasing pretreatment PSA level, PSA nadir ≥2.5 ng/mL, PSA doubling time of <3 months, and Stage T3 disease. An IBF of <18 months was the only predictor of PCSM (p = 0.0003) in the same model. The actuarial 5-year DM rate for an IBF of <18 vs. ≥18 months was 52% vs. 20% (p < 0.0001), and the actuarial PCSM rate was 36% vs. 6%, respectively (p = 0.0001). Conclusions: The IBF is an important descriptor of PSA kinetics after radiotherapy that identifies men at high risk of clinical failure and death. An IBF of <18 months could aid in selecting men for early, aggressive salvage therapy or participation in a clinical trial.
Zeilberger, Doron
Show that the family {cos nx}, n = 1, 2, ..., is orthogonal over the interval [0, π]. Also find the norm of each function. Sol. We need to take two different, typical members of this family, so let's call them cos nx and cos mx, where n ≠ m. We have to show that (cos mx, cos nx) = 0. (cos mx, cos nx) = ∫_0^π cos mx cos nx dx. We now ...
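The integral the solution sets up evaluates with the product-to-sum identity; completing the computation, together with the norm the problem also asks for:

```latex
(\cos mx,\cos nx)=\int_0^{\pi}\cos mx\,\cos nx\,dx
  =\tfrac12\int_0^{\pi}\bigl[\cos(m-n)x+\cos(m+n)x\bigr]\,dx
  =\tfrac12\Bigl[\tfrac{\sin(m-n)x}{m-n}+\tfrac{\sin(m+n)x}{m+n}\Bigr]_0^{\pi}=0
  \qquad (m\neq n),

\|\cos nx\|^{2}=\int_0^{\pi}\cos^{2}nx\,dx
  =\tfrac12\int_0^{\pi}\bigl[1+\cos 2nx\bigr]\,dx=\frac{\pi}{2},
  \qquad \|\cos nx\|=\sqrt{\pi/2}.
```

Both boundary terms vanish because sin kπ = 0 for every integer k, which is where the orthogonality comes from.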
Hart, Gus
Cardiac Inter-Beat Interval Complexity is Influenced by Physical ... Wilson BJ, Hart GLW, Parcell AC. B. J. Wilson (Department of Recreation Management and Youth ...), Gus L. W. Hart, and Allen C. Parcell. Medical Physiology Online (MPO), http://www.medicalphysiologyonline.org
SIAM J. SCI. STAT. COMPUT. Vol. 7, No. 2, April 1986
O'Leary, Dianne P.
CONFIDENCE INTERVALS FOR INEQUALITY-CONSTRAINED LEAST SQUARES PROBLEMS, WITH APPLICATIONS
Yun-Ming Dong; Yi-Ping Qin
2005-03-16
In the present paper, we investigated the distribution of hardness ratio (HR) for short and long gamma-ray bursts (GRBs) on different time scales for the first two seconds. After including and subtracting the background count, we performed a Kolmogorov-Smirnov (K-S) test on the HR distributions of the two classes of GRBs in each time interval. Our analysis shows that the probabilities of the K-S test for the distributions are very small, suggesting that the two classes of bursts are unlikely to arise from the same HR distribution, and indicating that they probably originate from different physical processes and central engines. In addition, we found that the hardness ratio of short bursts within the time interval of 0-0.96 s evolves from hard to soft, whereas that of long bursts does not. The two kinds of bursts have different characteristics in the first 2 seconds, which might be associated with different physical mechanisms.
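A two-sample K-S comparison of the kind described can be sketched in a few lines. The data here are synthetic normal draws standing in for the two HR samples (not the GRB data), and the 1% critical value uses the standard asymptotic approximation:

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    between the two empirical CDFs."""
    allv = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), allv, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), allv, side="right") / len(y)
    return float(np.max(np.abs(cdf_x - cdf_y)))

rng = np.random.default_rng(1)
# synthetic stand-ins for short- and long-burst hardness ratios
hr_short = rng.normal(1.2, 0.3, 200)
hr_long = rng.normal(0.9, 0.3, 200)

d = ks_statistic(hr_short, hr_long)
# asymptotic 1% critical value: c(0.01) * sqrt((n + m) / (n * m))
crit = 1.63 * np.sqrt((200 + 200) / (200 * 200))
print(d > crit)  # True: the two distributions clearly differ
```

Small K-S p-values (equivalently, a statistic well above the critical value) are what the abstract uses to argue the two burst classes do not share one HR distribution.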
Gonzalez, Francisco Manuel
1987-01-01
[Figure-list excerpt: Figure 4, effect of re/rw on critical rate for the two-phase problem (rDe = 10, xD = 0.1); Figure 5, critical oil rate as a function of completion interval location and length for the three-phase ...; Figure 9, effect of re/rw on critical rate for the three-phase problem (water-influenced zone, rDe = 5); Figure 10, two-phase critical rate correlation, qDc predicted vs. qDc actual; Figure 11, three-phase critical ...]
(24) (3-minute) reference gas intervals: 450 ppm CO2, sf = 10 Hz
Saltzman, Eric
[Figure residue: concatenated (3-minute) reference gas intervals, 450 ppm CO2, sf = 10 Hz; CO2 vs. day of year 2006; script licormotionmodel.m]
Orlove, Steven T.; Smith, Charles W.; Vasquez, Bernard J.; Schwadron, Nathan A.; Skoug, Ruth M.; Zurbuchen, Thomas H.; Zhao Liang E-mail: Charles.Smith@unh.edu E-mail: N.Schwadron@unh.edu E-mail: thomasz@umich.edu
2013-09-01
We have examined 226 intervals of nearly radial interplanetary magnetic field orientations at 1 AU lasting in excess of 6 hr. They are found within rarefaction regions as are the previously reported high-latitude observations. We show that these rarefactions typically do not involve high-speed wind such as that seen by Ulysses at high latitudes during solar minimum. We have examined both the wind speeds and the thermal ion composition before, during and after the rarefaction in an effort to establish the source of the flow that leads to the formation of the rarefaction. We find that the bulk of the measurements, both fast- and slow-wind intervals, possess both wind speeds and thermal ion compositions that suggest they come from typical low-latitude sources that are nominally considered slow-wind sources. In other words, we find relatively little evidence of polar coronal hole sources even when we examine the faster wind ahead of the rarefaction regions. While this is in contrast to high-latitude observations, we argue that this is to be expected of low-latitude observations where polar coronal hole sources are less prevalent. As with the previous high-latitude observations, we contend that the best explanation for these periods of radial magnetic field is interchange reconnection between two sources of different wind speed.
Harris, S.; Gross, R.; Mitchell, E.
2011-01-18
The Savannah River Site (SRS) spring-operated relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection (RBI) technology. In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating informed by SRS SORV proof-testing experience. Initial Weibull parameter estimates were updated using SRS's historical proof-test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORVs' probability of failure on demand (PFD) and the annual expected risk, and it indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper illustrates inspection costs versus the associated risks using API RP 581 RBI technology, and demonstrates a cost-effective maintenance frequency balancing both financial risk and inspection cost.
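The Weibull life model at the core of this kind of analysis can be sketched in a few lines; the parameters and risk formula below are illustrative placeholders, not SRS or API RP 581 values:

```python
# Weibull probability of failure on demand (PFD) by inspection time t,
# and a toy expected-risk calculation. Illustrative parameters only.
import math

def weibull_pfd(t, eta, beta):
    """P(failure by time t) for a Weibull life model (eta = scale, beta = shape)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def expected_annual_risk(t, eta, beta, demand_rate, consequence):
    """Toy risk metric: demand frequency x PFD at inspection interval t x consequence cost."""
    return demand_rate * weibull_pfd(t, eta, beta) * consequence
```

Lengthening the inspection interval t raises the PFD and hence the risk term; the trade-off against inspection cost is what the paper's maintenance-frequency optimization balances.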
Total length (mm) Sample 95% confidence
Bear Mountain, N.Y., March 28-30, 1976. Hudson River Environmental Society, Inc. FISHERY BULLETIN: VOL ...
Standards Increase Market Confidence in SSL Performance
2013-09-30
Fact sheet that reviews current and future SSL standards developed by DOE and other standards-setting organizations to effectively measure and characterize SSL lighting products.
The CASL vision is to confidently predict
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Frequent-Interval Seismic CPTu
Office of Environmental Management (EM)
site characterization. Evaluation of non-linear soil behavior... detailed stratigraphy small-strain velocity measurements large-strain non-seismic measurements...
Blin, Guillaume
Conclusion Comparing genomes Genomes evolved from a common ancestor tend to share the same varieties of gene clusters used in genomes comparison. . . . seeking for gene clusters between their genomes. A gene cluster = a set of genes appearing, in spatial proximity along the chromosome, in at least two genomes. G. Blin
Wong, Limsoon
associated with true-positive protein interactions, e.g., the "new interaction generality" (IG2) and "meso-scale motifs" (NeMoFinder) indices.
Masaru Ikehata
2013-10-30
This paper considers an inverse problem for the classical wave equation in an exterior domain. It is a mathematical interpretation of an inverse obstacle problem which employs the dynamical scattering data of an acoustic wave over a finite time interval. It is assumed that the wave satisfies a Robin-type boundary condition with an unknown variable coefficient. The wave is generated by initial data localized outside the obstacle and observed over a finite time interval at the same place as the support of the initial data. It is already known that, using the enclosure method, one can extract from the data the maximum sphere whose exterior encloses the obstacle. In this paper, it is shown that the enclosure method also enables us to extract: (i) a quantity which indicates the deviation of the geometry between the maximum sphere and the boundary of the obstacle at the first reflection points of the wave; (ii) the value of the coefficient of the boundary condition at an arbitrary first reflection point of the wave provided, for example, the surface of the obstacle is known in a neighbourhood of the point. A further new result is that the enclosure method can cover the case when the data are taken over a sphere whose centre coincides with that of the support of the initial data, and yields results corresponding to (i) and (ii).
Bowcock, A.M.; Osborne-Lawrence, S. ); Anderson, L.A.; Friedman, L.S.; Rowell, S.E.; Hall, J.M.; King, M.C. ); Black, D.M.; Solomon, E. )
1993-04-01
In order to pinpoint the locale of the gene for early-onset familial breast and ovarian cancer (BRCA1), polymorphisms were developed within the locus for thyroid hormone receptor alpha (THRA1) and for several anonymous sequences at chromosome 17q12-q21. The THRA1 polymorphism is a dinucleotide repeat with 10 alleles and heterozygosity .79. Gene mapping in extended families with inherited, early-onset breast and ovarian cancer indicates that BRCA1 is distal to THRA1 and proximal to D17S183 (SCG43), an interval of <4 cM. This locale excludes HER2, THRA1, WNT3, HOX2, NGFR, PHB, COLIA1, NME1, and NME2 as candidates for BRCA1 but does not exclude RARA or EDH17B. Resolving the remaining recombination events in these families by new polymorphisms in the THRA1-D17S183 interval will facilitate positional cloning of the breast-ovarian cancer gene on chromosome 17q12-q21. 16 refs., 3 figs., 1 tab.
Kreinovich, Vladik
... Mariana Peña (1), Mathew J. Rister (1), Abraham Saldaña (1), John Vasquez (1), Janelle Ybarra (1), and Salem ...
Eliminating Duplicates Under Interval and Fuzzy Uncertainty
Kreinovich, Vladik
... and Mexico and parts of Africa. The geophysical use of the gravity database compiled at UTEP is illustrated ... records. Why duplicates are a problem: duplicate values can corrupt the results of statistical data ...
OPTIMAL INTERVAL ENCLOSURES FOR FRACTIONALLY-LINEAR FUNCTIONS,
Kreinovich, Vladik
El Paso TX 79968, USA, email vladik@cs.ep.utexas.edu 3 Sistemas de Informacion, Division de Ingeneria y Ciencias, ITESM (Instituto Technologico de Monterrey), Campus Estado de Mexico, Apdo. Postal 2
Interval Analysis for Unknown Dependencies and Genetic
: Power Systems Engineering Research Center, Cornell University, 428 Phillips Hall, Ithaca, New York 14853. ... given to MidAmerican Energy for its support of this project. Thanks are also given to our industry advisors: O. Dale Stevens, II, MidAmerican Energy Co.; John Thomas Chatelain, MidAmerican Energy Co.
Lyapunov Spectral Intervals: Theory and Computation
Dieci, Luca; Van Vleck, Erik
2002-05-05
exponents in stability theory. Important results on stability of Lyapunov exponents that we use are due to Bylov [6], Bylov et al. [5], Bylov and Izobov [7], and Millionshchikov [24, 25]. An alternative to the spectrum of Lyapunov is based upon defining a ... School of Mines, Golden, CO 80401 (evanvlec@mines.edu).
Professional Role Confidence and Gendered Persistence in Engineering
Cech, Erin
Social psychological research on gendered persistence in science, technology, engineering, and mathematics (STEM) professions is dominated by two explanations: women leave because they perceive their family plans to be at ...
Using Subjective Confidence to Improve Metacognitive Monitoring Accuracy and Control
Miller, Tyler
2012-10-19
Metacognition is defined as a person's awareness of the capabilities and vulnerabilities of their own cognition and also encompasses the actions that a person takes as a result of that awareness. The awareness and actions that a person takes...
Special Focus on High-Confidence Software Technologies SCIENCE CHINA
Yang, Yun
, China; 3 School of Software and Electrical Engineering, Swinburne University of Technology, Hawthorn, VIC ... of high-level middleware services for different computing paradigms such as cluster, grid, and cloud ... intensive e-science applications such as weather forecast, earthquake modeling, and astrophysics [4] ...
Pair programming improves student retention, confidence, and program quality
McDowell, C; Werner, L; Bullock, H E; Fernald, J
2006-01-01
Werner, L.L. Building Pair Programming Knowledge through a ... The Impact of Pair Programming on Student Performance and ... to Know About Pair Programming I Learned in Kindergarten.
Confidence-Based Robot Policy Learning from Demonstration
Veloso, Manuela M.
Thesis Committee: Manuela Veloso (Chair), Christopher Atkeson, Avrim Blum, Cynthia Breazeal (MIT Media Lab). ... I am also thankful to my entire thesis committee, Chris Atkeson, Avrim Blum and Cynthia Breazeal ... labmates and friends at Carnegie Mellon, especially Colin McMillen, Doug Vail, Liz Crawford, Scott Lenser ...
Status Update: Extended Storage and Transportation Waste Confidence
Broader source: Energy.gov [DOE]
Presentation made by David W. Pstrak for the NTSF annual meeting held from May 14-16, 2013 in Buffalo, NY.
Professional Role Confidence and Gendered Persistence in Engineering
Cech, Erin; Rubineau, Brian; Silbey, Susan; Seron, Caroll
2011-01-01
Evidence from the Leveraged Buyout Industry." American ... professionals in the leveraged buyout industry), it may be ...
STATISTICS OF PRECIPITATION EXTREMES: QUANTIFYING CONFIDENCE IN TRENDS
Katz, Richard
temperature, wind speed, sea level · Block maxima approach · Estimated 100-yr (i.e., p = 0 ...) · Point process combines Poisson and GP into a single model · Difficulties: choice of threshold · Trends, example (Mercer Creek, WA): effect of urbanization on stream flow in a small water basin; lack ...
Shan-Guang Tan
2015-06-15
The representation of even numbers as the sum of two primes and the distribution of primes in short intervals were investigated, and a main theorem was stated and proved: For every number $n$ greater than a positive number $n_{0}$, let $q$ be an odd prime number smaller than $\sqrt{2n}$ and $d=2n-q$; then there is always at least one odd number $d$ which does not contain any prime factor smaller than $\sqrt{2n}$ and must be an odd prime number greater than $2n-\sqrt{2n}$. It was then proved that for every number $n$ greater than 1 there is always at least one pair of primes $p$ and $q$ symmetrical about the number $n$, so that every even number greater than 2 can be expressed as the sum of two primes; hence, Goldbach's conjecture was proved. Theorems on the distribution of primes in short intervals were also stated and proved. By these theorems, Legendre's conjecture, Oppermann's conjecture, Hanssner's conjecture, Brocard's conjecture, Andrica's conjecture, Sierpinski's conjecture and Sierpinski's conjecture on triangular numbers were proved, and the Mills constant can be determined. The representation of odd numbers as the sum of an odd prime and an even semiprime was investigated, and a main theorem was stated and proved: For every number $n$ greater than a positive number $n_{0}$, let $q$ be an odd prime number smaller than $\sqrt{2n}$ and $d=2n+1-2q$; then there is always at least one odd number $d$ which does not contain any odd prime factor smaller than $\sqrt{2n}$ and must be a prime number greater than $2n+1-2\sqrt{2n}$. It was then proved that for every number $n$ greater than 2 there is always at least one pair of primes $p$ and $q$ such that every odd integer greater than 5 can be represented as the sum of an odd prime and an even semiprime; hence, Lemoine's conjecture was proved.
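The Goldbach statement discussed above is easy to check empirically for small even numbers; the sketch below is a finite sanity check, not a proof, and the names are illustrative:

```python
# Verify that every even number 4 <= n <= 200 is a sum of two primes.
def is_prime(n):
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n):
    """Return (p, q) with p + q = n and both prime (smallest such p), or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None
```

For example, `goldbach_pair(28)` finds 5 + 23, the decomposition with the smallest prime summand.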
Analysis Of Macroscopic Fractures In Granite In The Hdr Geothermal...
core section over a depth interval from 1420 to 2230 m: 97% of the macroscopic structures were successfully reorientated with a good degree of confidence by comparison...
Unified approach to the classical statistical analysis of small signals Gary J. Feldman*
Feldman, Gary
... one- or two-sided intervals leads to intervals which are not confidence intervals if the choice is based on the data. We apply ... and Astronomy, University of California, Los Angeles, California 90095. Received 21 November 1997; published ... led the Particle Data Group (PDG) [2] to describe procedures for Bayesian interval construction ...
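The classical construction that Feldman and Cousins unify can be sketched for the simplest case, a background-free Poisson upper limit: the smallest mean mu with P(N <= n_obs; mu) <= alpha. This is the plain Neyman upper limit, not the full unified likelihood-ratio ordering; function names are illustrative:

```python
# Classical Poisson upper limit by bisection on the Poisson CDF.
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return math.exp(-mu) * sum(mu ** k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, alpha=0.1, hi=1000.0, tol=1e-9):
    """Smallest mu with poisson_cdf(n_obs, mu) <= alpha (1 - alpha CL upper limit)."""
    lo = 0.0
    while hi - lo > tol:  # bisection: the CDF is decreasing in mu
        mid = (lo + hi) / 2
        if poisson_cdf(n_obs, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

With zero observed events the 90% CL upper limit is ln(10) ≈ 2.303, the textbook value; the unified approach modifies how the acceptance intervals are ordered, not this underlying coverage requirement.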
Ramin Zahedi
2014-12-28
In this paper, based on a sort of linear function, a deterministic and simple algorithm with an algebraic structure is presented for calculating all (and only) the $k$-almost primes (where $\exists n\in {\rm N}$, $1{\le} k {\le} n$) in a certain interval. A theorem is proven showing a new deterministic property of the category of $k$-almost primes. Through a linear function that we obtain, an equivalent redefinition of the $k$-almost primes with an algebraic characteristic is identified. Moreover, as an outcome of our function's properties, some relations containing new information about the $k$-almost primes (including the primes) are presented.
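For reference, k-almost primes are the numbers with exactly k prime factors counted with multiplicity; a naive enumeration by factor counting (not the paper's linear-function algorithm) looks like this:

```python
# k-almost primes in an interval, by straightforward factor counting.
def big_omega(n):
    """Number of prime factors of n, counted with multiplicity."""
    count, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:          # leftover factor is prime
        count += 1
    return count

def k_almost_primes(k, lo, hi):
    return [n for n in range(max(lo, 2), hi + 1) if big_omega(n) == k]
```

For k = 1 this recovers the primes; for k = 2 on [4, 20] it yields the semiprimes 4, 6, 9, 10, 14, 15.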
Figure 6. Projected Production for the Low Development Rate of...
U.S. Energy Information Administration (EIA) Indexed Site
6. Projected Production for the Low Development Rate of Technically Recoverable Oil Estimated at 5 Percent, Mean, and 95 Percent Probabilities for the ANWR Coastal Plain of the...
Towards Combining Probabilistic, Interval, Fuzzy Uncertainty, and Constraints: On the
Kreinovich, Vladik
... resources such as the oil in the Middle East. However, nowadays, most easy-to-access mineral resources have ... In the search for natural resources, it is very important to determine Earth structure. Our civilization greatly depends on the things we extract from the Earth, such as fossil fuels (oil, coal, natural ...
Application-Motivated Combinations of Fuzzy, Interval, and Probability Approaches,
Kreinovich, Vladik
to Geoinformatics, Bioinformatics, and Engineering Vladik Kreinovich Department of Computer Science University
ApplicationMotivated Combinations of Fuzzy, Interval, and Probability Approaches,
Kreinovich, Vladik
to Geoinformatics, Bioinformatics, and Engineering Vladik Kreinovich Department of Computer Science University
Tolerance Intervals for Exponentiated Scale Family of Distributions
Kundu, Debasis
&amp; D. Kundu. * Department of Statistics, Shivaji University, Kolhapur, India; ** Department of Statistics, Kisan Veer Mahavidyalaya, Wai, Satara, India; Department of Mathematics, Indian Institute ... Correspondence address: D.T. Shirke, Department of Statistics, Shivaji University, Kolhapur, India, 416004. Email ...
Gauge Theories on an Interval: Unitarity Without a Higgs Boson
Csaki, Csaba; Grojean, Christophe; Murayama, Hitoshi; Luigi, Pilo; Terning, John
2004-01-01
... breaking without a Higgs boson. Gauge Theories on an ... scattering amplitude. The Higgs boson is localized at y = πR ... real scalar field, the Higgs boson. At tree level, the ...
Intelligent Control in Space Exploration: Interval Computations are Needed
Kreinovich, Vladik
missions, but also in the chemical industry, in metallurgy, in business). These experts usually cannot
Rigorous investigations of Ikeda map by means of interval arithmetic
Galias, Zbigniew
of Electrical Engineering, University of Mining and Metallurgy, al. Mickiewicza 30, 30-059 Kraków, Poland, e ...
OPTIMAL INTERVAL COMPUTATION TECHNIQUES: OPTIMIZATION OF NUMERICAL METHODS
Kreinovich, Vladik
Paso, TX 79968, USA, email vladik@cs.utep.edu 2 Sistemas de Informacion, Division de Ingeneria y Ciencias, ITESM (Instituto Technologico de Monterrey) Campus Estado de Mexico, Apdo. Postal 2, Modulo de
Unstable AMOC during glacial intervals and millennial variability...
Office of Scientific and Technical Information (OSTI)
Number: NEK0059281; AGS-1405272; SC0007037; MESO-CLIP Type: Published Article Journal Name: Earth and Planetary Science Letters Additional Journal Information: Journal...
Interval operations in rounding to nearest Siegfried M. Rump
Rump, Siegfried M.
floating-point arithmetic, rounding to nearest, predecessor, successor, directed rounding. AMS subject classification (2000): ... the rounding to nearest "ties to even" and the rounding to nearest "ties to away" (away from zero) ... Let β be the radix used in this floating-point format. We require β to be even and greater than one.
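The predecessor and successor of a float, the neighbours one reasons about when analysing rounding to nearest, are directly available in Python (3.9+) via `math.nextafter`:

```python
# Predecessor and successor of 1.0 in IEEE-754 binary64.
import math

x = 1.0
succ = math.nextafter(x, math.inf)   # smallest float strictly greater than x
pred = math.nextafter(x, -math.inf)  # largest float strictly smaller than x
```

Note the asymmetry at a power of two: the successor of 1.0 is 1 + 2^-52 (one ulp above), while the predecessor is 1 - 2^-53, because the exponent, and hence the spacing, drops just below 1.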
Use of Utility Interval Meters in an Industrial Energy Audit
Wallace, M.
2007-01-01
[Figure residue: Figure 2, 24-hour interval-meter demand profile (kW max, min, average, and max count by hour, 4:00 AM through 11:00 PM); companion plot of demand and demand delta (kW), with swings of -665, +682, -1273, -1221, and -913 kW.]
Interval Approach to Identification of Catalytic Process Parameters
Kearfott, R. Baker
. Mikushina Institute of Organic Synthesis, Ural Branch of Russian Academy of Sciences, Ekaterinburg, Russia of Mathematics and Mechanics, Ural Branch of Russian Academy of Sciences, and Ural Federal University, Institute of Radio-Electronics and Informational Technologies, Ekaterinburg, Russia kumkov@imm.uran.ru Yuliya V
Real-Time Correction of Heart Interbeat Intervals
Hoover, Adam
Heart rate variability (HRV) is traditionally analyzed while a subject is in a controlled environment, such as at rest in a clinic, where it can be used as a medical indicator. This paper concerns analyzing HRV ... heartbeats. HRV analysis studies cyclical variations in a heartbeat series related ...
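Interbeat-interval correction generally means flagging physiologically implausible intervals (missed or spurious beats) and replacing them. One simple scheme, replacement by the local median, can be sketched as follows; this is an illustrative approach, not the paper's algorithm, and the 20% tolerance is an assumed threshold:

```python
# Replace interbeat intervals (IBIs, in ms) that deviate more than `tol`
# from the median of the other samples. Toy artifact-correction sketch.
import statistics

def correct_ibis(ibis, tol=0.2):
    out = list(ibis)
    for i, x in enumerate(ibis):
        others = ibis[:i] + ibis[i + 1:]
        med = statistics.median(others)
        if abs(x - med) / med > tol:   # implausible beat: likely artifact
            out[i] = med               # replace with the local median
    return out
```

A missed beat shows up as a roughly doubled interval (e.g., 1600 ms amid ~800 ms beats) and is pulled back toward the surrounding rhythm; real-time methods refine this with sliding windows and interpolation.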
QT-interval adaptation to changes in autonomic balance
Nosakhare, Ehimwenma
2013-01-01
ECG variability, as it relates to the influence of the autonomic nervous system on the heart, is primarily studied via frequency-domain and time-domain analysis of heart rate variability (HRV). HRV studies the variability ...
Guaranteed state estimation by zonotopes for systems with interval uncertainties
Damm, Werner
C. Stoica, T. Alamo, E.F. Camacho, D. Dumur. This talk focuses on guaranteed state estimation by zonotopes [1] ... example. References: [1] T. Alamo, J.M. Bravo, and E.F. Camacho. Guaranteed state estimation by zonotopes. Automatica, 41:1035-1043, 2005. [2] V.T.H. Le, T. Alamo, E.F. Camacho, C. Stoica, and D. Dumur. A new ...
Exact Bounds for Interval Functions Under Monotonicity Constraints,
Ward, Karen
to Paleontology Emil Platon Energy & Geoscience Institute University of Utah 423 Wakara Way, Suite 300 Salt Lake that are the closest to the surface are the least disturbed by drilling. In both cases, for the selected fossil, we
Computing minimum geodetic sets in proper interval graphs
Heggernes, Pinar
, denoted by NG(v), is the set of vertices of G that are adjacent to v. For a set S of vertices of G, G
Revealing Hidden Interval Graph Structure in STSContent Data
Bonner, Anthony
accomplishments of the Human Genome Project to date. In genome parlance, a map is a collection of spa... ... with examples of its application to current STS data from human genome centers. Availability: freely available ... complete contig map of the human genome. Eventually they joined forces with each other, and with several other groups ...
Interval Krawczyk and Newton method February 20, 2007
Zgliczynski, Piotr
on X, which implies assertion 0 and the uniqueness part in assertion 1. We have, for any x0, x1 ∈ X, f(x1) ... For the uniqueness it is enough to show that Nm is a contraction on U. Observe that it is impossible to verify ... the middle value form of Nm can cure this deficiency. If x0 ∈ [x], then Nm([x]) ⊆ Nm(x0) + [dNm([x])] · ([x ...
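The one-dimensional interval Newton step behind these operators is N([x]) = m - f(m)/F'([x]), intersected with [x], where m is the midpoint and F'([x]) encloses the derivative. A minimal sketch for f(x) = x^2 - 2 on [1, 2], whose derivative enclosure [2·lo, 2·hi] never contains 0 there; it omits outward rounding, so enclosures are rigorous only up to floating-point error:

```python
# One-dimensional interval Newton iteration enclosing sqrt(2) in [1, 2].
def newton_step(lo, hi, f, dlo, dhi):
    m = (lo + hi) / 2.0
    fm = f(m)
    q = sorted([fm / dlo, fm / dhi])     # fm / [dlo, dhi]; 0 not in [dlo, dhi]
    n_lo, n_hi = m - q[1], m - q[0]      # N([x]) = m - fm / F'([x])
    return max(lo, n_lo), min(hi, n_hi)  # intersect with [lo, hi]

def enclose_sqrt2(iterations=6):
    lo, hi = 1.0, 2.0
    f = lambda x: x * x - 2.0
    for _ in range(iterations):
        lo, hi = newton_step(lo, hi, f, 2.0 * lo, 2.0 * hi)  # F'([x]) = [2lo, 2hi]
    return lo, hi
```

Each step keeps the root inside the enclosure while the width shrinks quadratically; the Krawczyk operator plays the same role when a derivative enclosure containing 0 rules out plain interval division.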
Exact Bounds for Interval and Fuzzy Functions Under Monotonicity Constraints,
Ward, Karen
of fossil species in samples recovered from a well that penetrates an undisturbed sequence of sedimentary ... the environment in which rocks have accumulated: for example, a coral is an unambiguous indication of a warm ocean ... that in a normal sequence the age increases with the depth in the well that penetrates that sequence. So ...
An Assessment of Interval Data and Their Potential Application to
Gasoline and Diesel Fuel Update (EIA)
Bonomo, Flavia
...ón, FCEyN, Universidad de Buenos Aires, Buenos Aires, Argentina; 3 Depto. de Matemática and Instituto de Cálculo
Western University Rehabilitation Services
Lennard, William N.
Western University Rehabilitation Services, Transitional Accommodation Program. ... by Rehabilitation Services and is updated at frequent intervals in order to: 1. confirm progression toward treatment ... information is held in the strictest confidence within Rehabilitation Services. Only capabilities ...
Process for estimating likelihood and confidence in post detonation nuclear forensics.
Darby, John L.; Craft, Charles M.
2014-07-01
Technical nuclear forensics (TNF) must provide answers to questions of concern to the broader community, including an estimate of uncertainty. There is significant uncertainty associated with post-detonation TNF. The uncertainty consists of a great deal of epistemic (state of knowledge) as well as aleatory (random) uncertainty, and many of the variables of interest are linguistic (words) and not numeric. We provide a process by which TNF experts can structure their process for answering questions and provide an estimate of uncertainty. The process uses belief and plausibility, fuzzy sets, and approximate reasoning.
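One concrete reading of the "belief and plausibility" machinery the abstract mentions is a Dempster-Shafer mass assignment over hypothesis sets; the masses and hypothesis names below are made-up illustrative values, not forensic data:

```python
# Belief and plausibility under a Dempster-Shafer basic mass assignment.
def belief(masses, hypothesis):
    """Bel(A): total mass of focal sets wholly contained in A (committed support)."""
    return sum(m for s, m in masses.items() if s <= hypothesis)

def plausibility(masses, hypothesis):
    """Pl(A): total mass of focal sets intersecting A (support not ruled out)."""
    return sum(m for s, m in masses.items() if s & hypothesis)

masses = {
    frozenset({"device_type_X"}): 0.5,
    frozenset({"device_type_X", "device_type_Y"}): 0.3,
    frozenset({"device_type_Y", "device_type_Z"}): 0.2,
}
```

The gap between Bel and Pl, here [0.5, 0.8] for device_type_X, expresses epistemic uncertainty that a single probability cannot, which is why this formalism suits the state-of-knowledge uncertainty the abstract describes.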
Key challenges to model-based design : distinguishing model confidence from model validation
Flanagan, Genevieve (Genevieve Elise Cregar)
2012-01-01
Model-based design is becoming more prevalent in industry due to increasing complexities in technology while schedules shorten and budgets tighten. Model-based design is a means to substantiate good design under these ...
Doty, Sharon Lafferty
[Slide residue: Manage operational and business risk; synthesize information and inform campus; manage investment; manage risk. Risk management: cost of risk per $1 of operating expense (R3), $.0052 vs. $.0059. Productivity, Finance &amp; Facilities vs. U.S. Department of Labor: 14.4% vs. 7.8%, no gap. Purchase goods and services.]
Challenging government: institutional arrangements, policy shocks, and no-confidence motions
Williams, Laron Kenneth
2009-05-15
... Dion faced a difficult choice, as about two dozen Liberals had voted to extend the mission in May, including Dion himself. If he tried to whip the Liberals into supporting the motion, he would risk punishing and potentially losing a number of party ... governments. Great Britain 1979-1981: After the British general election of May 3, 1979, the Conservatives were in a strong position. With the help of the Scottish National Party (SNP), they had just defeated the minority Labour government on a no...
Odegard, Ryan Glenn
2008-01-01
The growing size, complexity and demands of engineering systems requires paying greater attention to the initial design of the system concept. To improve the process by which concept design is carried out, this thesis ...
Model-Based Methodology for Building Confidence in a Dynamic Measuring System
Reese, Isaac Mark
2013-05-03
... Set #2, Test #24 (400 lbf, 11 fps). Figure 12. Set #3, Test #44 (600 lbf, 16 fps). Figure 13. Set #4, Test #64 (400 lbf, 7.8 fps). Figure 14. Set #5, Test #84 (600 lbf, 7.8 fps). Figure 15. Accelerometer Data from Test #82 ...
Proving the Integrity of the Weighted Sum Squared Error (WSSE) Loran Cycle Confidence
Stanford University
efforts on the Department of Transportation's technical evaluation of Loran. Per Enge is a professor ... Department at the U.S. Coast Guard Academy, in New London, CT. After his retirement from the Coast Guard, he ...
Brim, Cornelia P.
2013-04-01
An important requirement for the international safeguards community is the ability to determine the enrichment level of uranium in gas centrifuge enrichment plants and nuclear fuel fabrication facilities. This is essential to ensure that countries with nuclear nonproliferation commitments, such as States Party to the Nuclear Nonproliferation Treaty, are adhering to their obligations. However, current technologies to verify the uranium enrichment level in gas centrifuge enrichment plants or nuclear fuel fabrication facilities are technically challenging and resource-intensive. NNSA’s Office of Nonproliferation and International Security (NIS) supports the development, testing, and evaluation of future systems that will strengthen and sustain U.S. safeguards and security capabilities—in this case, by automating the monitoring of uranium enrichment in the entire inventory of a fuel fabrication facility. One such system is HEVA—hybrid enrichment verification array. This prototype was developed to provide an automated, nondestructive assay verification technology for uranium hexafluoride (UF6) cylinders at enrichment plants.
Stark, Philip B.
· Study commissioned by USDoJ re Child Online Protection ... CHILD PORNOGRAPHY · Exemptions for literary, artistic, and educational content, ISPs, search engines. · Requires age screen for commercial porn. · Credit card number deemed adequate proof of age. Background ...
BOUNDED CONFIDENCE OPINION DYNAMICS WITH NETWORK CONSTRAINTS AND LOCALIZED DISTRIBUTED AVERAGING
Rabbat, Michael
Michael Rabbat, Electrical and Computer Engineering, McGill University, Montréal, Québec, Canada. ABSTRACT ... bandwidth, and energy. A related line of work has been pursued in the sociology and physics ...
Bever, Caitlin Anne
2008-01-01
Many cellular processes are governed by large and highly-complex networks of chemical interactions and are therefore difficult to intuit. Computational modeling provides a means of encapsulating information about these ...
IMPACT OF EXCITATION FREQUENCY ON SHORT-TERM RECORDING SYNCHRONISATION AND CONFIDENCE ESTIMATION
... at the will of their users. This leaves us with the metadata and audiovisual signals to infer synchronisation information. The available camera time and recording time in the metadata are based on the personal capturing devices (e.g. sensor, lens, microphones). Therefore, raw audiovisual signals are not suitable for synchronisation ...
Confidence from uncertainty - A multi-target drug screening method from robust control theory
Luni, Camilla; Shoemaker, Jason E; Sanft, Kevin R; Petzold, Linda R; Doyle, Francis J
2010-01-01
... method from robust control theory. BMC Systems Biology, 2010. Camilla Luni, Jason E. ... of a method from robust control theory, Structured Singular ...
Geothermal reservoir simulation to enhance confidence in predictions for nuclear waste disposal
Kneafsey, Timothy J.; Pruess, Karsten; O'Sullivan, Michael J.; Bodvarsson, Gudmundur S.
2002-01-01
California. The Mammoth geothermal field is a single-phase, liquid-dominated field with a 40 MW power plant.
On Using Nearly-Independent Feature Families for High Precision and Confidence
Toronto, University of
... families and different ways of processing the different signals. For example, YouTube videos contain audio, gradient, and motion-related histogram features extracted from the visual signal. Given access to such rich ... such as text, audio, and video features are available, combining the outputs of base classifiers trained ...
Determining X-ray source intensity and confidence bounds in crowded fields
Primini, F. A.; Kashyap, V. L.
2014-11-20
We present a rigorous description of the general problem of aperture photometry in high-energy astrophysics photon-count images, in which the statistical noise model is Poisson, not Gaussian. We compute the full posterior probability density function for the expected source intensity for various cases of interest, including the important cases in which both source and background apertures contain contributions from the source, and when multiple source apertures partially overlap. A Bayesian approach offers the advantages of allowing one to (1) include explicit prior information on source intensities, (2) propagate posterior distributions as priors for future observations, and (3) use Poisson likelihoods, making the treatment valid in the low-counts regime. Elements of this approach have been implemented in the Chandra Source Catalog.
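In the simplest background-free case this Bayesian treatment reduces to the conjugate Gamma-Poisson update: with a Gamma(alpha, beta) prior on the expected counts and n observed counts in one exposure, the posterior is Gamma(alpha + n, beta + 1). A sketch of that simplification (not the paper's full overlapping-aperture treatment):

```python
# Conjugate Gamma-Poisson posterior for a source intensity, background-free,
# single unit exposure. Shape/rate parameterization of the Gamma.
def posterior_params(n, alpha=1.0, beta=0.0):
    """Posterior Gamma(shape, rate) after observing n Poisson counts."""
    return alpha + n, beta + 1.0

def posterior_mean(n, alpha=1.0, beta=0.0):
    shape, rate = posterior_params(n, alpha, beta)
    return shape / rate
```

Because the posterior is an honest Poisson-likelihood object, it remains valid at low counts where Gaussian error bars fail, and it can be propagated as the prior for a subsequent observation, which is exactly the advantage the abstract lists.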
Lyons, Jeffrey M. (Jeffrey Michael), 1973-
2000-01-01
As the use of distributed engineering models becomes more prevalent, engineers need tools to evaluate the quality of these models and understand how subsystem uncertainty affects predictions of system behavior. This thesis ...
The committee says that although public confidence in agriculture is at an all-time low
McDonald, Kirk
US high-energy physicists in Washington, facing deep budget cuts and divided over short-term research needs … "nor sustainable". A report from the House of Commons select committee on science and technology points … million research programme. Peter Rosen, head of the high-energy and nuclear physics office …
Rapid Deployment with Confidence:Calibration and Fault Detection in Environmental Sensor Networks
2006-01-01
… wireless sensor network (WSN). This model, which holds great … the long-term, autonomous, and static WSN deployment model. Rapidly … unachieved. Additionally, as WSN technology is in its …
New ITER head is confident the fusion energy project will succeed
Kramer, David
2015-05-15
Bernard Bigot sees management of the seven-party international effort as a greater challenge than the technological demands.
Kurtz, S.; Wohlgemuth, J.; Kempe, M.; Bosco, N.; Hacke, P.; Jordan, D.; Miller, D.
2013-09-01
Four levels of accelerated test standards for PV modules are described in the context of how the community can most quickly begin using these.
Online-to-Confidence-Set Conversions and Application to Sparse Stochastic Bandits
Pál, Dávid
…, 2008), reinforcement learning (Bartlett and Tewari, 2009; Jaksch et al., 2010), or active learning …
Code verification and confidence-building (Technical Report) | SciTech
Office of Scientific and Technical Information (OSTI)
2011-08 "Restore User Confidence in the Risk Analysis, Communication,
Broader source: Energy.gov (indexed) [DOE]
Method for Confidence Metric in Optic Disk Location in Retinal Images -
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The measurement of attenuation from vertical seismic profiles
Davis, Francis Erwin
1983-01-01
… and the calcareous content of the shales. Slightly to non-calcareous shales exhibited the highest attenuation values. Calcareous to very calcareous shales; low-porosity, cemented sandstones; and limestones exhibited the lowest attenuation values. No correlation … [Figures 40-42: cumulative attenuation and 90% confidence intervals for downhole and synthetic data, VSP3.]
Verification of unfold error estimates in the unfold operator code
Fehl, D.L.; Biggs, F.
1997-01-01
Spectral unfolding is an inverse mathematical operation that attempts to obtain spectral source information from a set of response functions and data measurements. Several unfold algorithms have appeared over the past 30 years; among them is the unfold operator (UFO) code written at Sandia National Laboratories. In addition to an unfolded spectrum, the UFO code also estimates the unfold uncertainty (error) induced by estimated random uncertainties in the data. In UFO the unfold uncertainty is obtained from the error matrix. This built-in estimate has now been compared to error estimates obtained by running the code in a Monte Carlo fashion with prescribed data distributions (Gaussian deviates). In the test problem studied, data were simulated from an arbitrarily chosen blackbody spectrum (10 keV) and a set of overlapping response functions. The data were assumed to have an imprecision of 5% (standard deviation). One hundred random data sets were generated. The built-in estimate of unfold uncertainty agreed with the Monte Carlo estimate to within the statistical resolution of this relatively small sample size (95% confidence level). A possible 10% bias between the two methods was unresolved. The Monte Carlo technique is also useful in underdetermined problems, for which the error matrix method does not apply. UFO has been applied to the diagnosis of low energy x rays emitted by Z-pinch and ion-beam driven hohlraums. © 1997 American Institute of Physics.
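The comparison described, an analytic error-matrix estimate checked against Monte Carlo unfolds of noisy data realizations with Gaussian deviates, can be illustrated on a toy linear unfold. The response matrix, spectrum, and least-squares unfold operator below are invented for the sketch; they are not the UFO code's actual test problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: a 3-bin "spectrum" observed through 5 overlapping responses.
R = np.array([[1.0, 0.5, 0.1],
              [0.4, 1.0, 0.4],
              [0.1, 0.5, 1.0],
              [0.3, 0.7, 0.2],
              [0.2, 0.2, 0.9]])
x_true = np.array([4.0, 7.0, 3.0])
d_true = R @ x_true
sigma = 0.05 * d_true                  # 5% (1-sigma) data imprecision

# Least-squares unfold operator: x = pinv(R) @ d.
P = np.linalg.pinv(R)

# Error-matrix style estimate: propagate the data covariance through P.
cov_analytic = P @ np.diag(sigma**2) @ P.T
err_analytic = np.sqrt(np.diag(cov_analytic))

# Monte Carlo estimate: unfold many noisy realizations (Gaussian deviates).
n_trials = 2000
unfolds = np.array([P @ (d_true + rng.normal(0.0, sigma))
                    for _ in range(n_trials)])
err_mc = unfolds.std(axis=0, ddof=1)
```

For a linear unfold the two estimates must agree up to Monte Carlo sampling noise; the Monte Carlo route remains usable when no error matrix is available.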
Elias, Dwayne A.; Mukhopadhyay, Aindrila; Joachimiak, Marcin P.; Drury, Elliott C.; Redding, Alyssa M.; Yen, Huei-Che B.; Fields, Matthew W.; Hazen, Terry C.; Arkin, Adam P.; Keasling, Jay D.; Wall, Judy D.
2008-10-27
Hypothetical and conserved hypothetical genes account for >30 percent of sequenced bacterial genomes. For the sulfate-reducing bacterium Desulfovibrio vulgaris Hildenborough, 347 of the 3634 genes were annotated as conserved hypothetical (9.5 percent) along with 887 hypothetical genes (24.4 percent). Given the large fraction of the genome, it is plausible that some of these genes serve critical cellular roles. The study goals were to determine which genes were expressed and provide a more functionally based annotation. To accomplish this, expression profiles of 1234 hypothetical and conserved genes were used from transcriptomic datasets of 11 environmental stresses, complemented with shotgun LC-MS/MS and AMT tag proteomic data. Genes were divided into putatively polycistronic operons and those predicted to be monocistronic, then classified by basal expression levels and grouped according to changes in expression for one or multiple stresses. Of these genes, 1212 were transcribed and 786 produced detectable proteins. There was no evidence for expression of 17 predicted genes. Except for the latter, monocistronic gene annotation was expanded using the above criteria along with matching Clusters of Orthologous Groups. Polycistronic genes were annotated in the same manner, with inferences from their proximity to more confidently annotated genes. Two targeted deletion mutants were used as test cases to determine the relevance of the inferred functional annotations.
Richard Bowersox; John Hickman; Hannes Leetaru
2012-12-01
Part 1 of this report focuses on results of the western Kentucky carbon storage test, and provides a basis for evaluating injection and storage of supercritical CO{sub 2} in Cambro-Ordovician carbonate reservoirs throughout the U.S. Midcontinent. This test demonstrated that the Cambro-Ordovician Knox Group, including the Beekmantown Dolomite, Gunter Sandstone, and Copper Ridge Dolomite in stratigraphic succession from shallowest to deepest, had reservoir properties suitable for supercritical CO{sub 2} storage in a deep saline reservoir hosted in carbonate rocks, and that strata with properties sufficient for long-term confinement of supercritical CO{sub 2} were present in the deep subsurface. Injection testing with brine and CO{sub 2} was completed in two phases. The first phase, a joint project by the Kentucky Geological Survey and the Western Kentucky Carbon Storage Foundation, drilled the Marvin Blan No. 1 carbon storage research well and tested the entire Knox Group section in the open borehole, including the Beekmantown Dolomite, Gunter Sandstone, and Copper Ridge Dolomite, at 1152–2255 m, below casing cemented at 1116 m. During Phase 1 injection testing, most of the 297 tonnes of supercritical CO{sub 2} was displaced into porous and permeable sections of the lowermost Beekmantown below 1463 m and the Gunter. The wellbore was then temporarily abandoned with a retrievable bridge plug in casing at 1105 m and two downhole pressure-temperature monitoring gauges below the bridge plug pending subsequent testing. Pressure and temperature data were recorded every minute for slightly more than a year, providing a unique record of subsurface reservoir conditions in the Knox. In contrast, Phase 2 testing, this study, tested a mechanically isolated dolomitic-sandstone interval in the Gunter.
Operations in the Phase 2 testing program commenced with retrieval of the bridge plug and long-term pressure gauges, followed by mechanical isolation of the Gunter by plugging the wellbore with cement below the injection zone at 1605.7 m, then cementing a section of 14-cm casing at 1470.4–1535.6 m. The resultant 70.1-m test interval at 1535.6–1605.7 m included nearly all of the Gunter sandstone facies. During the Phase 2 injection, 333 tonnes of CO{sub 2} were injected into the thick, lower sand section in the sandy member of the Gunter. Following the completion of testing, the injection zone below casing at 1116 m in the Marvin Blan No. 1 well and the wellbore below 305 m were permanently abandoned with cement plugs and the wellsite reclaimed. The range of most-likely storage capacities found in the Knox in the Marvin Blan No. 1 is 1000 tonnes per surface hectare in the Phase 2 Gunter interval to 8685 tonnes per surface hectare if the entire Knox section were available, including the fractured interval near the base of the Copper Ridge. By itself the Gunter lacks sufficient reservoir volume to be considered for CO{sub 2} storage, although it may provide up to 18% of the reservoir volume available in the Knox. Regional extrapolation of CO{sub 2} storage potential based on the results of a single well test can be problematic, although indirect evidence of porosity and permeability can be demonstrated in the form of active saltwater-disposal wells injecting into the Knox. The western Kentucky region suitable for CO{sub 2} storage in the Knox is limited updip, to the east and south, by the depth at which the base of the Maquoketa shale lies above the depth required to ensure storage of CO{sub 2} in its supercritical state and the deepest a commercial well might be drilled for CO{sub 2} storage. The resulting prospective region has an area of approximately 15,600 km{sup 2}, beyond which it is unlikely that suitable Knox reservoirs may be developed.
Faults in the subsurface, which serve as conduits for CO{sub 2} migration and compromise sealing strata, may limit the area with Knox reservoirs suitable for CO{sub 2} storage. The results of the injection test …
ENERGY UTILIZATION AND ENVIRONMENTAL CONTROL TECHNOLOGIES IN THE COAL-ELECTRIC CYCLE
Ferrell, G.C.
2010-01-01
… include energy recovery, sulfur removal, coal fines, SO2 … U.S. coals range from 85 to 95 percent energy recovery with … Coal handling and preparation; preheaters and dissolvers; mineral separation (filters); solvent recovery.
Readout of Secretary Chu Meetings on Carbon Capture and Sequestration...
Broader source: Energy.gov (indexed) [DOE]
a conventional power plant by as much as 95 percent. The International Energy Agency (IEA) estimates that CCS can account for 20 percent of global mitigation by 2050. The...
Boyer, Elizabeth W.
… using ethanol rather than methanol, the resulting molecules are "fatty acid ethyl esters" … hand, consists of about 95 percent saturated hydrocarbons and 5 percent aromatic compounds.
4 2015-2016 CSULB Catalog Welcome to the California State University (CSU) the
Sorin, Eric J.
… of excellence. Since 1961, the CSU has provided an affordable, accessible, and high-quality education to nearly … from the CSU. · The CSU awards 95 percent of the hospitality/tourism degrees in the state. · Nearly …
Journal of Arid Environments 69 (2007) 633–657
Ahmad, Sajjad
2007-01-01
Keywords: bioturbation; ecohydrology; hydraulic conductivity; Mojave Desert. … hydraulic properties. Separate measurements were made in shrub undercanopy and intercanopy microsites … horizons in intercanopy soils in which saturated hydraulic conductivity (Ksat) decreased 95 percent from …
Economic implications of natural gas vehicle technology in U.S. private automobile transportation
Kragha, Oghenerume Christopher
2010-01-01
Transportation represents almost 28 percent of the United States' energy demand. Approximately 95 percent of U.S. transportation utilizes petroleum, the majority of which is imported. With significant domestic conventional ...
Moore, Carmel; Sambrook, Jennifer; Walker, Matthew; Tolkien, Zoe; Kaptoge, Stephen; Allen, David; Mehenny, Susan; Mant, Jonathan; Di Angelantonio, Emanuele; Thompson, Simon G.; Ouwehand, Willem; Roberts, David J.; Danesh, John
2014-09-17
… includes a minimisation algorithm to ensure that key prognostic characteristics are balanced across the trial arms at baseline (for example, new/repeat donor status, weight and age, as shown in Table 2). Randomisation was stratified by donation centre … tri-axial accelerometer AX3 (Axivity, York, UK) to measure the impact of more frequent blood donations … Serious adverse events, diagnosed … heart problems (including heart …), new illness, diagnoses of low iron, diagnosis of low haemoglobin by NHSBT …
… that planetary warming is well underway, the climate research community looks to palaeoclimate research for a ground-truthing measure with which to test the accuracy of future climate simulations. Model experiments … be considered in such an exercise. The most recent period of sustained global warmth similar to what …
Studer, Bettina; Limbrick-Oldfield, Eve H.; Clark, Luke
2014-10-28
… reverse. In the context of sports, if a player scores with three successive shots, spectators tend to expect the player to score with their next attempt; this was originally described in basketball and labeled the "hot hand" belief (Alter & Oppenheimer …). … then classified as higher (1) or lower (0) than the individual participant's average. Trial-by-trial data were analyzed using logistic regression in R (R Core Team, Vienna, Austria). Three primary logistic regression models were created. Model 1 tested …
Forrest, Claire L
2009-07-03
Previous research has established that feeling of another’s knowing (FOAK) can be judged from the paralinguistic cues present in speakers’ utterances (Brennan & Williams, 1995). The current study investigates whether this ...
Broader source: Energy.gov [DOE]
The National Renewable Energy Laboratory (NREL) has published protocols for estimating energy savings for residential and commercial energy efficiency programs and measures through the recently released “The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures.”
National Nuclear Security Administration (NNSA)
Fourier Analytic Approach to Phase Estimation
Hiroshi Imai; Masahito Hayashi
2008-10-31
For a unified analysis of phase estimation, we focus on the limiting distribution. It is shown that the limiting distribution can be given by the absolute square of the Fourier transform of an $L^2$ function whose support belongs to $[-1,1]$. Using this relation, we study the relation between the variance of the limiting distribution and its tail probability. We prove that the protocol minimizing the asymptotic variance does not minimize the tail probability. Depending on the width of the interval, we derive the estimation protocol minimizing the tail probability outside a given interval. Such an optimal protocol is given by a prolate spheroidal wave function, which often appears in wavelet or time-limited Fourier analysis. The minimum confidence interval is also derived within the framework of interval estimation that assures a given confidence coefficient.
Kreinovich, Vladik
… century. The result was the discovery of many large, relatively easy to locate resources such as oil. … A Brief Description: In evaluations of natural resources and in the search for natural resources … from the Earth, such as fossil fuels (oil, coal, natural gas), minerals, and water. Our need …
Reliable Computing 1 (2) (1995), pp. 141-172 Applications of interval computations to
Kearfott, R. Baker
1995-01-01
Timing analysis of logic-level digital circuits using uncertainty intervals
Bell, Joshua Asher
1996-01-01
Competitive design of modern digital circuits requires high performance at reduced cost and time-to-market. Timing analysis is increasingly used to deal with the more aggressive timing constraints …
Chacron, Maurice
… through the comparison of two simple firing models, one of which is a renewal process while the other … Models in which all second- and higher-order ISI correlations are zero are called renewal processes [6]. However, … information transfer is presently unknown due to the LIFDT model's complexity and to the memory carried …
Woerner, Kyle
2014-01-01
High contact density environments are becoming ubiquitous in autonomous marine vehicle (AMV) operations. Safely managing these environments and their mission greatly taxes platforms. AMV collisions will likely increase as ...
RIGOROUS INVESTIGATIONS OF PERIODIC ORBITS IN AN ELECTRONIC CIRCUIT BY MEANS OF INTERVAL METHODS
Galias, Zbigniew
Zbigniew Galias, Department of Electrical Engineering, University of Mining and Metallurgy, al. Mickiewicza …
Proving the existence of long periodic orbits in 1D maps using interval Newton method
Galias, Zbigniew
… shooting. Zbigniew Galias, Department of Electrical Engineering, University of Mining and Metallurgy, al. …
PROVING THE EXISTENCE OF PERIODIC SOLUTIONS USING GLOBAL INTERVAL NEWTON METHOD
Galias, Zbigniew
Department of Electrical Engineering, University of Mining and Metallurgy, al. Mickiewicza 30, 30-059 Kraków … of Scientific Research KBN, grant no. 0449/P3/94/06, and by University of Mining and Metallurgy, grant no. 10 …
Detecting Duplicates in Geoinformatics: from Intervals and Fuzzy Numbers to General MultiD
Kreinovich, Vladik
…,raraiza,vladik}@utep.edu; Hung T. Nguyen, Department of Mathematical Sciences, New Mexico State University, Las Cruces, NM 88003 … gravity measurements collected throughout the United States and Mexico and parts of Africa … records. Why duplicates are a problem: duplicate values can corrupt the results of statistical data …
Baykara, N. A.; Guervit, Ercan; Demiralp, Metin
2012-12-10
In this work a study on finite-dimensional matrix approximations to products of quantum mechanical operators is conducted. It is emphasized that the matrix representation of the product of two operators is equal to the product of the matrix representations of the individual operators when all the fluctuation terms are ignored. The calculation of the elements of the matrices corresponding to the matrix representations of various operators, based on a three-term recursion relation, is defined. Finally it is shown that the approximation quality depends on the choice of higher values of n, namely the dimension of the Hilbert space.
Optimal Sojourn Time Control within an Interval. Jianghai Hu and Shankar Sastry
Sastry, S. Shankar
Department of Electrical Engineering and Computer Sciences, University of California at Berkeley, Berkeley, CA 94720 … the following scenario. Suppose that there are three consecutive cars driving in the same direction on a road, numbered 1, 2, and 3 from front to end. The body length of each car is … [1] This material is based upon work …
Droplet Nucleation and Domain Wall Motion in a Bounded Interval Robert S. Maier
Maier, Robert S.
magnetization. In the weak-noise limit, noise-activated magnetization reversals become exponentially rare, the reversal rate being given by the Kramers formula, rate ∝ exp(−ΔW/ε), where ε is the noise strength and ΔW the activation barrier. … We study a spatially extended model of noise-induced magnetization reversal: a classical Ginzburg …
Calibration Monitoring for Sensor Calibration Interval Extension: Gaps in the Current Science Base
Coble, Jamie B.; Ramuhalli, Pradeep; Meyer, Ryan M.; Hashemian, Hash; Shumaker, Brent; Cummins, Dara
2012-10-09
Currently in the United States, periodic sensor recalibration is required for all safety-related sensors, typically occurring at every refueling outage, and it has emerged as a critical path item for shortening outage duration in some plants. International application of calibration monitoring has shown that sensors may operate for longer periods within calibration tolerances. This issue is expected to also be important as the United States looks to the next generation of reactor designs (such as small modular reactors and advanced concepts), given the anticipated longer refueling cycles, proposed advanced sensors, and digital instrumentation and control systems. Online monitoring (OLM) can be employed to identify those sensors that require calibration, allowing for calibration of only those sensors that need it. The U.S. Nuclear Regulatory Commission (NRC) accepted the general concept of OLM for sensor calibration monitoring in 2000, but no U.S. plants have been granted the necessary license amendment to apply it. This paper summarizes a recent state-of-the-art assessment of online calibration monitoring in the nuclear power industry, including sensors, calibration practice, and OLM algorithms. This assessment identifies key research needs and gaps that prohibit integration of the NRC-approved online calibration monitoring system in the U.S. nuclear industry. Several technical needs were identified, including an understanding of the impacts of sensor degradation on measurements for both conventional and emerging sensors; the quantification of uncertainty in online calibration assessment; determination of calibration acceptance criteria and quantification of the effect of acceptance criteria variability on system performance; and assessment of the feasibility of using virtual sensor estimates to replace identified faulty sensors in order to extend operation to the next convenient maintenance opportunity.
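Online calibration monitoring schemes vary by vendor and plant; one common building block is to compare each redundant channel against a robust best estimate of the process value and flag channels whose deviation exceeds the calibration acceptance band. The median-based check, channel count, tolerance, and drift magnitude below are all illustrative assumptions, not the report's method:

```python
import numpy as np

def flag_drifting_channels(readings, tolerance):
    """Flag channels whose mean deviation from the redundant-channel
    median exceeds the calibration acceptance band.

    readings: (n_samples, n_channels) array of redundant measurements
    tolerance: acceptance band in engineering units
    Returns a boolean array, True where the channel should be recalibrated.
    """
    best_estimate = np.median(readings, axis=1, keepdims=True)
    residual = np.abs(readings - best_estimate).mean(axis=0)
    return residual > tolerance

rng = np.random.default_rng(1)
true_value = 100.0
data = true_value + rng.normal(0.0, 0.2, size=(500, 4))
data[:, 2] += 1.5          # channel 2 has drifted out of tolerance
flags = flag_drifting_channels(data, tolerance=0.5)
```

Only the flagged channel would then be scheduled for recalibration, which is the selective-calibration idea the abstract describes.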
Pattern Graphs: Combining Multivariate Time Series and Labelled Interval Sequences for
Berthold, Michael R.
[Fig. 1: example of a pattern graph describing a driving cycle (learned from data, see [10]); nodes such as "gear up", "gear down", and "low/middle revolutions" carry interval labels like [1,*] and [1,50].]
INDEX TO VOLUME 190/196 560-mbsf Fractured Interval, structures, 196A4:21
… and 196 of the Proceedings of the Ocean Drilling Program (published as separate leg-specific books) … was prepared by Earth Systems, under subcontract to the Ocean Drilling Program. The index contains two … Antarctic Bottom Water, clay, 190/196B4:8; Antarctic Intermediate Water, clay, 190/196B4:8; Ashizuri Transect …
… of the signals at unit resolution. C3H3+ at m/z 39, which made up about 5% of the total organic signal … In this comparison, a factor (OM:OC = 1.7 [Fuzzi et al., 2007]) is used for the conversion of OC reported … e-folding conversion of 1.2 days from hydrophobic to hydrophilic [Cooke et al., 1999]. Organic particle mass loading …
Normal-Based Methods for a Gamma Distribution: Prediction and Tolerance Intervals
Krishnamoorthy, Kalimuthu
… of contamination (e.g., landfill by a waste management facility, hazardous material storage facility, or factory) … found a number of applications in occupational and industrial hygiene. In a recent article, Maxim et al. …
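Normal-based methods for gamma data typically rest on the Wilson-Hilferty result that cube roots of gamma variates are approximately normal. The sketch below is an assumption about that general approach, not the paper's exact procedure; the sample, shape/scale values, and 95% setting are illustrative:

```python
import numpy as np

def gamma_prediction_interval(sample, z=1.96):
    """Approximate 95% prediction interval for a new observation from a
    gamma-distributed population, via the Wilson-Hilferty cube-root
    transformation: Y = X**(1/3) is treated as normal."""
    y = np.asarray(sample, dtype=float) ** (1.0 / 3.0)
    n = y.size
    ybar, s = y.mean(), y.std(ddof=1)
    half = z * s * np.sqrt(1.0 + 1.0 / n)   # prediction-interval half-width
    lo, hi = ybar - half, ybar + half
    return max(lo, 0.0) ** 3, hi ** 3       # back-transform to original scale

rng = np.random.default_rng(2)
sample = rng.gamma(shape=2.0, scale=3.0, size=60)
lo, hi = gamma_prediction_interval(sample)
```

Working on the cube-root scale keeps the normal machinery while respecting the right skew of gamma data; the interval is back-transformed at the end.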
Finding limiting flows of batch extractive distillation with interval …; Erika R. Frits
Csendes, Tibor
…, Hungary, e-mail: ufo@mail.bme.hu; HAS-BUTE Research Group of Technical Chemistry, H-1521 Budapest, P. …
A Review of Sensor Calibration Monitoring for Calibration Interval Extension in Nuclear Power Plants
Coble, Jamie B.; Meyer, Ryan M.; Ramuhalli, Pradeep; Bond, Leonard J.; Hashemian, Hash; Shumaker, Brent; Cummins, Dara
2012-08-31
Currently in the United States, periodic sensor recalibration is required for all safety-related sensors, typically occurring at every refueling outage, and it has emerged as a critical path item for shortening outage duration in some plants. Online monitoring can be employed to identify those sensors that require calibration, allowing for calibration of only those sensors that need it. International application of calibration monitoring, such as at the Sizewell B plant in United Kingdom, has shown that sensors may operate for eight years, or longer, within calibration tolerances. This issue is expected to also be important as the United States looks to the next generation of reactor designs (such as small modular reactors and advanced concepts), given the anticipated longer refueling cycles, proposed advanced sensors, and digital instrumentation and control systems. The U.S. Nuclear Regulatory Commission (NRC) accepted the general concept of online monitoring for sensor calibration monitoring in 2000, but no U.S. plants have been granted the necessary license amendment to apply it. This report presents a state-of-the-art assessment of online calibration monitoring in the nuclear power industry, including sensors, calibration practice, and online monitoring algorithms. This assessment identifies key research needs and gaps that prohibit integration of the NRC-approved online calibration monitoring system in the U.S. nuclear industry. Several needs are identified, including the quantification of uncertainty in online calibration assessment; accurate determination of calibration acceptance criteria and quantification of the effect of acceptance criteria variability on system performance; and assessment of the feasibility of using virtual sensor estimates to replace identified faulty sensors in order to extend operation to the next convenient maintenance opportunity. 
Understanding the degradation of sensors and the impact of this degradation on signals is key to developing technical basis to support acceptance criteria and set point decisions, particularly for advanced sensors which do not yet have a cumulative history of operating performance.
Herrin, D. G.
2007-01-01
… the types of data acquisition equipment and systems available and the different components of a data … Lastly, actual graphs of data will be presented to demonstrate how to dissect and analyze a data set and then implement measures that will optimize …
The maximum time interval of time-lapse photography for monitoring construction operations
Choi, Ji Won
2005-11-01
[Flattened table: sampling error versus time-lapse interval (tens of seconds up to 10 min) for operations E-B1, E-B2, E-H1, and E-H2; errors grow from roughly 5–13% at the shortest intervals to roughly 20–50% at 10 min.]
How to Estimate Expected Shortfall When Probabilities Are Known with Interval or Fuzzy
Kreinovich, Vladik
… the hurricane Katrina devastated New Orleans, why in 2011 the Fukushima nuclear power station in Japan was destroyed by an unusually high tsunami, etc. … the record of historic floods, tsunamis, hurricanes, earthquakes, and other natural disasters to estimate … Since we cannot have a threshold s0 that would guarantee …
Structural and functional characterization of the polled interval on bovine chromosome 1
Wunderlich, Kris Rakowitz
2008-10-10
… (relative quantity) and SE for expression in samples from 5- to 6-mo-old calves … Means (relative quantity) and SE for expression in samples from 1- to 8-d-old calves by sex that differed (P < 0.…) …, whereas high levels of BMP2 and BMP4 induce a chondrogenic fate. Differentiation into chondrocyte-like osteoblasts is regulated by both IHH and PTHrP activities (Abzhanov et al., 2007). Figure 1.2. Sequence of events of chondrogenesis …
A novel approach to determine post mortem interval using neutron radiography
Bilheux, Hassina Z; Cekanova, Maria; Vass, Arpad Alexander; Nichols, Trent L; Bilheux, Jean-Christophe; Donnell, Robert; Finocchiaro, Vincenzo
2015-01-01
In this study, neutron radiography (NR) is used non-destructively to measure changes in hydrogen (H) content in decaying tissues as a means to estimate post-mortem interval (PMI). After death, tissue undergoes sequential changes consisting of organic and inorganic phase variations, as well as a gradual reduction of tissue water content. H is the primary contributor to NR contrast in biological specimens because (1) it is the most abundant element in biological tissues and (2) its nucleus scatters thermal and cold neutrons more strongly than any other atomic nucleus. These contrast differences can be advantageous in a forensic context to determine small changes in hydrogen concentrations. Dog cadavers were used as a model for human cadavers. Canine tissues and cadavers were exposed to controlled (laboratory settings) and uncontrolled (University of Tennessee Anthropology Research Facility) environmental conditions during putrefaction, respectively. Neutron radiographs were supplemented with photographs and histology data to assess the decomposition stage of cadavers. Results demonstrated that the increase in neutron transmission likely corresponded to a decrease in hydrogen content in the tissue, which was correlated with the time of decay of the tissue. Tissues depleted in hydrogen appear brighter in the neutron transmission radiographs of skeletal muscles, lung, and bone under controlled conditions. Over a period of 10 days, changes in neutron transmission through lung, muscle, and bone were 8.3%, 7.0%, and 2.0%, respectively. Estimation of the PMI was calculated from a natural logarithmic fitting of the NR data. Under controlled conditions, estimation of the PMI was 70% and 63.9% accurate for bone and lung tissues, while being 1.4% accurate for muscle tissue. All results underestimated the true PMI. In conclusion, neutron radiography can be used to detect hydrogen changes in decaying tissues to estimate PMI.
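The PMI estimate described above comes from a natural-logarithmic fit of neutron-radiography data against time of decay. As an illustration only (synthetic data; the coefficients, noise level, and time span are invented, not the paper's measurements), a fit of T = a + b·ln(t) can be inverted to read a PMI off a measured transmission value:

```python
import numpy as np

# Synthetic stand-in for neutron-transmission measurements over 10 days:
# transmission rising as tissue loses hydrogen, T = a + b*ln(t) + noise.
rng = np.random.default_rng(3)
t_days = np.linspace(0.5, 10.0, 20)
a_true, b_true = 0.60, 0.05
T = a_true + b_true * np.log(t_days) + rng.normal(0.0, 0.002, t_days.size)

# Fit T = a + b*ln(t) by ordinary least squares on the log of time.
b_fit, a_fit = np.polyfit(np.log(t_days), T, 1)

def estimate_pmi(transmission):
    """Invert the fitted curve: t = exp((T - a) / b)."""
    return np.exp((transmission - a_fit) / b_fit)

# Reading off a PMI for a transmission value observed at day 4:
pmi = estimate_pmi(a_true + b_true * np.log(4.0))
```

Because the fitted curve is logarithmic, small transmission errors translate into multiplicative PMI errors, consistent with accuracy degrading for slowly changing tissues.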
How to Test Hypotheses When Exact Values are Replaced by Intervals to Protect Privacy: Case of
Kreinovich, Vladik
.g., [6]. All versions of the t-test are based on the sample means X = (1/n_x) · Σ_{i=1}^{n_x} x_i and Y = (1/n_y) · Σ_{i=1}^{n_y} y_i and the sample variances s²_X = (1/(n_x − 1)) · Σ_{i=1}^{n_x} (x_i − X)² and s²_Y = (1/(n_y − 1)) · Σ_{i=1}^{n_y} (y_i − Y)². For testing that the actual mean µ is µ₀, we use t = (X − µ₀)/(s_X/√n_x). For testing that the means are equal
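The sample-mean, sample-variance, and t-statistic formulas quoted in this snippet are easy to check numerically. The sketch below is an illustration only (the function name and data are invented, not taken from the paper), implementing the one-sample case exactly as written:

```python
import math

def one_sample_t(xs, mu0):
    # Sample mean: X = (1/n) * sum(x_i)
    n = len(xs)
    xbar = sum(xs) / n
    # Sample variance: s^2 = (1/(n-1)) * sum((x_i - X)^2)
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    # t = (X - mu0) / (s / sqrt(n))
    return (xbar - mu0) / math.sqrt(s2 / n)

# A sample whose mean equals mu0 gives t = 0 exactly.
print(one_sample_t([4.0, 5.0, 6.0], 5.0))  # → 0.0
```

The two-sample test quoted afterward follows the same pattern with a pooled or Welch standard error in the denominator.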
followed by an additional 10 hours with a 120-s switch interval at 2°C.
Spies, Thomas
was isolated and assayed essentially as described (33). The orc5-1 cultures were grown to A600 0.1 and either ) were harvested and lysed for each hourly time point. 25. The orc5 ts alleles were created by in vitro mutagenesis of the cloned ORC5 gene by the polymerase chain reaction (PCR). PCR-mutagenized DNA pools
Convergence Properties of an Interval Probabilistic Approach to System Reliability Estimation
Kreinovich, Vladik
reactor, the list of such characteristics includes neutron flux, temperature, etc. We assume. For example, a reactor shell can come from three different manufacturing plants, and we know the frequencies with which they come from different plants, i.e., the probabilities that a randomly selected shell is from
Convergence Properties of an Interval Probabilistic Approach to System Reliability Estimation
Kreinovich, Vladik
certain characteristics y = ⟨y^(1), y^(2), …, y^(m)⟩; e.g., for a nuclear reactor, the list of each such situation. For example, a reactor shell can come from three different manufacturing plants, and we know the frequencies with which they come from different plants, i.e., the probabilities
Targeted deletion of the 9p21 noncoding coronary artery disease risk interval in mice
Visel, Axel
2010-01-01
2588bp Suppl. Table 4 – Primer sequences and coordinates
Decision Making under Interval and Fuzzy Uncertainty: Towards an Operational Approach
Kreinovich, Vladik
State Oil Academy, Baku, Azerbaijan raliev@asoa.edu.az, oleg huseynov@yahoo.com 2 Azerbaijan Association assumption that for each two alternatives, a user can always meaningfully decide which of them. Traditional decision theory is based on a simplifying assumption that for each two alternatives, a user can
Case Studies in Using Whole Building Interval Data to Determine Annualized Electrical Savings
Effinger, M.; Anthony, J.; Webster, L.
2009-01-01
% to 114% when compared to the measured method results. INTRODUCTION The use of whole building data to develop energy models as a method to ascertain energy savings has been researched for many years. This method has been detailed in the IPMVP... adoption of Option C is the length of monitoring time required to develop reliable regression models. To be IPMVP adherent, both pre- and post-implementation data must be collected over a period that covers the full reporting period (IPMVP, 2007...
Indication of multiscaling in the volatility return intervals of stock markets Fengzhong Wang,1
Stanley, H. Eugene
of financial markets has long been a focus of economics and econophysics research 19. Studying recently, some related studies on financial markets, such as escape time 30, exit time 31,32, first and nonlinear features 36. Recent studies 37-39 of stock markets show that the distribution of activity
Multifactor analysis of multiscaling in volatility return intervals Fengzhong Wang,1
Stanley, H. Eugene
markets 17-21 show the following, for both daily and intraday data. (i) The distribution of the scaled and earthquakes 13-15. Also there are some related studies on financial markets, such as first passage time 25. Yamasaki,1,2 Shlomo Havlin,1,3 and H. Eugene Stanley1 1 Center for Polymer Studies and Department
THE SIZE OF EXPONENTIAL SUMS ON INTERVALS OF THE REAL LINE
Erdélyi, Tamás
| ≤ M j^µ, |a0| = 1, n ∈ ℕ, where the exponents λ_j ∈ ℝ satisfy λ_0 = 0, λ_j − λ_{j−1} > 0, j = 1, 2, … 's conjecture, Konyagin's conjecture, Uhrig protocol, decoupling methods, quantum coherence, multi-pulse control
Reliable Computing 2 (1) (1996), pp. 47-62: Variable-precision interval arithmetic
Kearfott, R. Baker
1996-01-01
1. Introduction. Roundoff error and catastrophic cancelation in scientific
File:Table for Tip Speed Intervals of Length.pdf | Open Energy Information
Gasoline and Diesel Fuel Update (EIA)
John McCord
2006-05-01
The Phase II Frenchman Flat groundwater flow model is a key element in the ''Federal Facility Agreement and Consent Order'' (FFACO) (1996) corrective action strategy for the Underground Test Area (UGTA) Frenchman Flat corrective action unit (CAU). The objective of this integrated process is to provide an estimate of the vertical and horizontal extent of contaminant migration for each CAU to predict contaminant boundaries. A contaminant boundary is the model-predicted perimeter that defines the extent of radionuclide-contaminated groundwater from underground testing above background conditions exceeding the ''Safe Drinking Water Act'' (SDWA) standards. The contaminant boundary will be composed of both a perimeter boundary and a lower hydrostratigraphic unit (HSU) boundary. The computer model will predict the location of this boundary within 1,000 years and must do so at a 95 percent level of confidence. Additional results showing contaminant concentrations and the location of the contaminant boundary at selected times will also be presented. These times may include the verification period, the end of the five-year proof-of-concept period, as well as other times that are of specific interest. This report documents the development and implementation of the groundwater flow model for the Frenchman Flat CAU. Specific objectives of the Phase II Frenchman Flat flow model are to: (1) Incorporate pertinent information and lessons learned from the Phase I Frenchman Flat CAU models. (2) Develop a three-dimensional (3-D), mathematical flow model that incorporates the important physical features of the flow system and honors CAU-specific data and information. (3) Simulate the steady-state groundwater flow system to determine the direction and magnitude of groundwater fluxes based on calibration to Frenchman Flat hydrogeologic data. 
(4) Quantify the uncertainty in the direction and magnitude of groundwater flow due to uncertainty in parameter values and alternative component conceptual models (e.g., geology, boundary flux, and recharge).
Double-Shell Tank Visual Inspection Changes Resulting from the Tank 241-AY-102 Primary Tank Leak
Girardot, Crystal L. [Washington River Protection Solutions, Richland, WA (United States); Washenfelder, Dennis J. [Washington River Protection Solutions, Richland, WA (United States); Johnson, Jeremy M. [USDOE Office of River Protection, Richland, WA (United States); Engeman, Jason K. [Washington River Protection Solutions, Richland, WA (United States)
2013-11-14
As part of the Double-Shell Tank (DST) Integrity Program, remote visual inspections are utilized to perform qualitative in-service inspections of the DSTs in order to provide a general overview of the condition of the tanks. During routine visual inspections of tank 241-AY-102 (AY-102) in August 2012, anomalies were identified on the annulus floor which resulted in further evaluations. In October 2012, Washington River Protection Solutions, LLC determined that the primary tank of AY-102 was leaking. Following identification of the tank AY-102 probable leak cause, evaluations considered the adequacy of the existing annulus inspection frequency with respect to the circumstances of the tank AY-102 leak and the advancing age of the DST structures. The evaluations concluded that the interval between annulus inspections should be shortened for all DSTs, and that each annulus inspection should cover > 95 percent of the annulus floor area and the portion of the primary tank (i.e., dome, sidewall, lower knuckle, and insulating refractory) that is visible from the annulus inspection risers. In March 2013, enhanced visual inspections were performed for the six oldest tanks: 241-AY-101, 241-AZ-101, 241-AZ-102, 241-SY-101, 241-SY-102, and 241-SY-103, and no evidence of leakage from the primary tank was observed. Prior to October 2012, the approach for conducting visual examinations of DSTs was to perform a video examination of each tank's interior and annulus regions approximately every five years (not to exceed seven years between inspections). Also, the annulus inspection only covered about 42 percent of the annulus floor.
Endeshaw, Tekola; Gebre, Teshome; Ngondi, Jeremiah; Graves, Patricia M.; Shargie, Estifanos B.; Ejigsemahu, Yeshewamebrat; Ayele, Berhan; Yohannes, Gedeon; Teferi, Tesfaye; Messele, Ayenew; Zerihun, Mulat; Genet, Asrat; Mosher, Aryc W.; Emerson, Paul M.; Richards, Frank O. Jr
2008-07-03
of whom 11,504 (82%) were included in the analysis. Overall slide positivity rate was 4.1% (95% confidence interval [CI] 3.4–5.0%) while ParaScreen RDT was positive in 3.3% (95% CI 2.6–4.1%) of those tested. Considering microscopy as the gold standard...
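The slightly asymmetric interval quoted here (4.1%, 95% CI 3.4–5.0%) is typical of score-type intervals for a binomial proportion. As a hedged illustration (the excerpt does not state which interval the authors actually computed, and the survey design may add adjustments this simple formula ignores), a Wilson score interval looks like:

```python
import math

def wilson_ci(successes, n, z=1.96):
    # Wilson score interval for a binomial proportion (approx. 95% for z = 1.96).
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Roughly 4.1% positivity among the 11,504 slides analyzed above:
lo, hi = wilson_ci(472, 11504)
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and behaves sensibly for proportions near zero, which matters at positivity rates of a few percent.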
Diversity, Institutions and Economic Outcomes
Santacreu Vasut, Estefania
2010-01-01
[Flattened table fragment: counts, proportions, and confidence intervals relative to WEO (number of witnesses expressing); all proportions rounded up.]
BOOK REVIEWS 143 attention to covariate-conditioned average and quantile effects, along with as-
Krishnamoorthy, Kalimuthu
confidence intervals and testing hypotheses. Their method is applied to a study of the impact of crude oil prices on gasoline prices. Hendry developed the general-to-specific (GETS) procedure for model selection, for example, asset prices, whose marginal distributions display strong nonnormal features such as skewness and
Prenatal Exposure to Nitrates, Nitrites, Nitrosatable Drugs, and Small-For-Gestational-Age Births
Shinde, Mayura
2013-11-27
(odds ratio [OR] 1.4 [95% confidence interval [CI] 1.0, 2.1]). This association was stronger among full term SGA births (OR 1.6 [95% CI 1.1, 2.3]). Dietary nitrites modified the associations between nitrosatable drugs and SGA, but lower odds of SGA were observed among...
The University of Chicago Department of Statistics
The University of Chicago Department of Statistics Statistics Colloquium LISA LENDWAY Department of Statistics University of Minnesota Using the Bootstrap to Teach Confidence Intervals in an Introductory Statistics Course THURSDAY, February 2, 2012, at 12:00 PM 110 Eckhart Hall, 5734 S. University Avenue
Probabilities of Possible Future Prices (Released in the STEO April 2010)
Reports and Publications (EIA)
2010-01-01
The Energy Information Administration introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts.
Environmental variability and the ecological effects of spawning Pacific salmon on stream biofilm
Tiegs, Scott
, combining data from different North Pacific Rim ecoregions inflated the confidence interval as compared ... Environmental variability and the ecological effects of spawning Pacific salmon on stream biofilm ... of organisms delivering resource subsidies, such as ecosystem engineering by Pacific salmon spawners
J Neurol (2008) 255:000–000 DOI 10.1007/s00415-008-0865-z ORIGINAL COMMUNICATION
Timmer, Jens
Academic Neurosurgery Unit PO Box 167 Addenbrooke's Hospital Cambridge CB2 2QQ, UK Abstract Dynamic, carotid recanalization without prior event, death or study end. Transcranial Doppler sonography was used predictive effect on ipsilateral ischemic events for impaired Dx (rate ratio 8.2 [95% confidence interval 1
1.0 Introduction 1.1 What is Visual Sample Plan?
the Mean 3.2.3 Construct Confidence Interval on Mean 3.2.4 Compare Proportion to Fixed Threshold 3.2.5 Compare Proportion to Reference Proportion 3.2.6 Estimate the Proportion 3.2.7 Locating a Hot Spot 3
September 2004 1.1 Visual Sample Plan Version 3.0 1.0 Introduction
3.2.2 Estimate the Mean 3.2.3 Construct Confidence Interval on Mean 3.2.4 Compare Proportion to Fixed Threshold 3.2.5 Compare Proportion to Reference Proportion 3.2.6 Estimate the Proportion 3.2.7 Locating a Hot Spot 3
March 2014 Visual Sample Plan Version 7.01.1 1.0 Introduction
3.2.11 Compare Proportion to Fixed Threshold 3.2.12 Compare Proportion to Reference Proportion 3.2.13 Construct Confidence Interval on Proportion 3.2.14 Estimate the Proportion 3.2.15 Establish Boundary of Contamination 3
September 2007 Visual Sample Plan Version 5.01.1 1.0 Introduction
the Mean 3.2.3 Construct Confidence Interval on Mean 3.2.4 Compare Proportion to Fixed Threshold 3.2.5 Compare Proportion to Reference Proportion 3.2.6 Estimate the Proportion 3.2.7 Locating a Hot Spot 3
Automatic Audio Segmentation: Segment Boundary and Structure Detection in
Rauber,Andreas
etc. This information can be used to create representative song excerpts or summaries, to facilitate ... confidence intervals and use a large ground-truth corpus which contains 94 songs of various genres. Finally, HTML reports and source code can be accessed on the web1. 2 Related work Foote [Foo00] was the first
Original Contribution Elevated Lung Cancer in Younger Adults and Low Concentrations of Arsenic
California at Berkeley, University of
Original Contribution Elevated Lung Cancer in Younger Adults and Low Concentrations of Arsenic ... long-term effects of arsenic. We performed a case-control study of lung cancer from 2007 to 2010 in areas ... or more years ago resulted in odds ratios for lung cancer of 1.00, 1.43 (90% confidence interval: 0.82, 2
Walsh, Michael; Merkel, Peter A.; Peh, Chen Au; Szpirt, Wladimir; Guillevin, Loïc; Pusey, Charles D.; deZoysa, Janak; Ives, Natalie; Clark, William F.; Quillen, Karen; Winters, Jeffrey L.; Wheatley, Keith; Jayne, David; PEXIVAS Investigators
2013-03-14
from, but stratified by, standard vs. reduced dose glucocorticoid regimen and vice versa). Point estimates and their corresponding 95% confidence intervals and p values will be calculated for all estimates of effect.
Stanford University
for lifetimes between 30-100 years, with a 90% confidence interval of 98-1200 MWth. Lumped parameter modeling ... the past 20 years. INTRODUCTION The OBGA comprises the regions of low temperature geothermal activity ... PROCEEDINGS, Thirty-Sixth Workshop on Geothermal Reservoir Engineering Stanford University
San Diego State University Department of Psychology Spring 2009
Gallo, Linda C.
). Research Methods and Statistics. New York: Harcourt College Publishers. Huck, S.W. (2008). Reading, procedures, and techniques used to conduct empirical research. Design of research will be covered, including confidence interval estimates, post hoc analyses, power estimation, correlation, regression, and statistical
Mikael Kuusela; Victor M. Panaretos
2015-07-13
We consider the high energy physics unfolding problem where the goal is to estimate the spectrum of elementary particles given observations distorted by the limited resolution of a particle detector. This important statistical inverse problem arising in data analysis at the Large Hadron Collider at CERN consists in estimating the intensity function of an indirectly observed Poisson point process. Unfolding typically proceeds in two steps: one first produces a regularized point estimate of the unknown intensity and then uses the variability of this estimator to form frequentist confidence intervals that quantify the uncertainty of the solution. In this paper, we propose forming the point estimate using empirical Bayes estimation which enables a data-driven choice of the regularization strength through marginal maximum likelihood estimation. Observing that neither Bayesian credible intervals nor standard bootstrap confidence intervals succeed in achieving good frequentist coverage in this problem due to the inherent bias of the regularized point estimate, we introduce an iteratively bias-corrected bootstrap technique for constructing improved confidence intervals. We show using simulations that this enables us to achieve nearly nominal frequentist coverage with only a modest increase in interval length. The proposed methodology is applied to unfolding the Z boson invariant mass spectrum as measured in the CMS experiment at the Large Hadron Collider.
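As a minimal sketch of the bootstrap step described in this abstract: the percentile bootstrap below is only the baseline construction, whose poor coverage under regularization bias motivates the paper's iterated bias correction (not reproduced here); all names and data are invented for illustration.

```python
import random

def percentile_bootstrap_ci(xs, stat, n_boot=2000, alpha=0.05, seed=0):
    # Resample with replacement, recompute the statistic, and take the
    # empirical alpha/2 and 1 - alpha/2 quantiles of the replicates.
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(xs) for _ in xs]) for _ in range(n_boot))
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]

mean = lambda v: sum(v) / len(v)
lo, hi = percentile_bootstrap_ci([2.1, 2.4, 1.9, 2.8, 2.2, 2.5, 2.0, 2.6], mean)
```

Because the replicates are centered on the (possibly biased) point estimate, such an interval inherits the estimator's bias, which is exactly the failure mode the paper's correction targets.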
, secure, and affordable energy while improving energy efficiency and preserving resources and lecturers allows students to learn from the world's leading architects, urban designers, landscape Energy Research (WISER) · The mission of WISER is to continue to improve the quality of life while
Pandey, Tulsi Ram
1993-01-01
to external goods intensified after the 1950s with the opening of rural interiors through transportation development. In 1951, Nepal had only a total of 376 kilometers of motorable roads of any kind. This increased to 6,306 kilometers in 1986 (CBS 1988: 119)... of the 1964 Lands Act was able to raise an equivalent of US dollar 15 million within four years of its implementation (Baskota and Lohani 1985: 101). Unfortunately the administrative machinery assigned to dispense credit was inefficient in recording...
Doucet, M.; Landrieu, M.; Montgomery, R.; O' Donnell, B.
2007-07-01
AREVA NP, as a worldwide PWR fuel provider, must maintain a fleet of fresh UO{sub 2} shipping casks approved in many countries - including the USA, France, Germany, Belgium, Sweden, China, and South Africa - and accommodate the fuel buildings of foreseen EPR Nuclear Power Plants. To reach this target the AREVA NP Fuel Sector decided to develop an up-to-date shipping cask (the so-called MAP project) gathering experience feedback from the current fleet and offering improved safety, allowing the design to comply with international regulations (NRC and IAEA) and local Safety Authorities. Based on pre-design features, a safety case was set up to highlight safety margins. Criticality hypothetical accidental assumptions were defined: - Preferential flooding; - Fuel rod lattice pitch expansion for the full length of fuel assemblies; - Neutron absorber penalty; -... Well-known computer codes, the American SCALE package and the French CRISTAL package, were used to check configuration reactivity and to ensure that both codes lead to coherent results. Basic spectral calculations are based on similar algorithms with specific microscopic cross sections: ENDF/B-V for SCALE and JEF2.2 for CRISTAL. The main differences between the two packages are that, on one hand, SCALE's three-dimensional fuel assembly geometry is described by a pin-by-pin model while a homogenized fuel assembly description is used by CRISTAL, and, on the other hand, SCALE works with either 44 or 238 neutron energy groups while CRISTAL works with 172 neutron energy groups. These two computer packages rely on a wide validation process helping to define uncertainties as required by regulations in force. The shipping cask with two fuel assemblies is designed to maximize fuel isolation inside a cask and from neighboring ones even for large array configuration cases. Proven industrial products are used: - Boral{sup TM} as neutron absorber; - High density polyethylene (HDPE) or Nylon as neutron moderator; - Foam as thermal and mechanical protection.
The cask is designed to handle the complete range of AREVA NP fuel assembly types from the 14x14 to the 18x18 design, with a {sup 235}U enrichment up to 5.0%, for enriched natural uranium (ENU) and enriched reprocessed uranium (ERU). After a brief presentation of the computer codes and the description of the shipping cask, calculation results and comparisons between SCALE and CRISTAL will be discussed. (authors)
Boyce, J. W.
The newly developed laser microprobe (U-Th)/He thermochronometer permits, for the first time, the generation of precise (U-Th)/He cooling ages for even very young (<1 Ma) samples, with a spatial resolution on the order ...
Cohen, Daniel Allen
2009-01-01
a teacher with good classroom management skills. TheseBenito Esteban Hilda classroom management character Mr. Y –content knowledge, classroom management, character, and a
1980-04-15
Purpose of this proceeding is to assess generically the degree of assurance that the radioactive waste can be safely disposed of, to determine when such disposal or off-site storage will be available, and to determine whether wastes can be safely stored on-site past license expiration until off-site disposal/storage is available. (DLC)
The musical representation of Asian characters in the musicals of Richard Rodgers
Ponti, Carla M.
2010-01-01
drones ... characterization. Intervals and drones. Certain intervals, these intervals, accompaniment drones on these intervals (
Boehning, Dankmar
by Mosley et al., which focussed on a cholera outbreak in East Pakistan. To demonstrate the wider range of a cholera outbreak in East Pakistan (East Pakistan was a former province of Pakistan which existed between developments, we will keep the name East Pakistan for the context of this publication since it refers
Roque Sol, Marco A.
2009-06-02
length. Two common tasks in signal analysis are: Elimination of high-frequency noise: this can be done by expressing f as a trigonometric series f(t) = a0 + Σ_k (a_k cos(kt) + b_k sin(kt)) and then setting the high-frequency coefficients (the a_k... and b_k for large k) equal to zero. Second, data compression: the idea is to send a signal in a way that requires minimal data transmission, which can be done by expressing f as a trigonometric series, as above, and then sending only those coefficients a_k...
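The expand-then-truncate procedure described in this snippet can be sketched directly. This is an illustrative naive implementation, not code from the text; the uniform-sampling convention on [0, 2π) is an assumption:

```python
import math

def trig_coeffs(samples, K):
    # Coefficients a_k, b_k of f(t) = a0 + sum_k (a_k cos(kt) + b_k sin(kt)),
    # estimated from n uniform samples of f on [0, 2*pi).
    n = len(samples)
    a, b = [0.0] * (K + 1), [0.0] * (K + 1)
    for k in range(K + 1):
        for i, f in enumerate(samples):
            t = 2 * math.pi * i / n
            a[k] += 2 * f * math.cos(k * t) / n
            b[k] += 2 * f * math.sin(k * t) / n
    a[0] /= 2  # a0 is the signal mean
    return a, b

def truncated_series(a, b, t):
    # Evaluating with a small K discards the high-frequency terms:
    # noise elimination and compression in one move.
    return a[0] + sum(a[k] * math.cos(k * t) + b[k] * math.sin(k * t)
                      for k in range(1, len(a)))
```

Keeping only K coefficient pairs instead of n raw samples is exactly the compression step the text mentions; zeroing the large-k terms is the denoising step.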
Crawford, John R; Deary, Ian J; Starr, John M; Whalley, Lawrence J
2001-01-01
Background. The National Adult Reading Test (NART) is widely used in research and clinical practice as an estimate of pre-morbid or prior ability. However, most of the evidence on the NART's validity as a measure of prior ...
Wang, F.; Yoshida, H.; Matsumoto, K.
2006-01-01
time can be known so that its energy consumption can be estimated accurately. In order to verify the simulation accuracy, an actual room equipped with a gas-engine heat pump (GHP) air-conditioning system is studied by both simulation and measurement...
Self-Tuning PI TCP Flow Controller for AQM Routers With Interval Gain and Phase Margin Assignment
Huang, Changcheng
-tuning proportional-integral (PI) controller for Active Queue Management (AQM) in the Internet. Classical control to achieve good AQM performance while adapting the AQM control system to great traffic load changes very well Queue Management, PI Control, Gain Margin, Phase Margin, Self-Tune 1. INTRODUCTION Congestion control
Plumley, Michael J
2015-01-01
Regulations aimed at improving fuel economy and reducing harmful emissions from internal combustion engines place constraints on lubricant formulations necessary for controlling wear and reducing friction. Viscosity reduction ...
Oliver, Jonathan
2012-10-19
.5 +/- 4.5yrs training) were matched according to baseline characteristics and randomly assigned to a STD or ALT 12 week hypertrophic training protocol. Body composition, strength (1RM bench and squat); power (60% 1RM bench and squat); and vertical jump...
Sawyer, Alexia
2011-05-27
Objectives Using the filled-duration illusion, this study investigated the existence of an independent temporal code operating in working memory. Extending research suggesting the principal distinction between filled- and ...
of Web Users with Rough K-means Pawan Lingras Chad West Abstract Data collection and analysis in web mining faces certain unique challenges. Due to a variety of reasons inherent in web browsing and web techniques in web mining need to accommodate such data. Fuzzy and rough sets provide the ability to deal
Fertin, Guillaume
.rusu}@univ-nantes.fr Abstract--During the last decade, we witnessed the huge impact of comparative genomics for understanding genomes (from the genome organization to their annotation). However, those genomic approaches ... genome context. Such limitation may be overcome thanks to recent high-throughput experimental progress
Schieber, Juergen
) of Tennessee--A combined sedimentologic, petrographic, and geochemical study Yifan Li a,b, , Juergen Schieber b
van Dorp, Johan René
performed in recent years in the maritime transportation domain. These studies have had significant impact Mexico and Canada, 95 percent of foreign trade and 25 percent of domestic trade depends on maritime accidents in maritime transportation. The consequences of these accidents ranged from severe environmental
are performing from an energy efficiency perspective. Buildings that consume less than 95 percent of the energy ... The Building Energy Report Card is used to compare the actual annual energy consumption of buildings to a State of Minnesota "target." This target represents the amount of energy that would
More Indian River residents living below poverty level, Census report says
Fernandez, Eduardo
More Indian River residents living below poverty level, Census report says By Keona Gardner Thursday, September 22, 2011 INDIAN RIVER COUNTY -- The county's poverty level is at a three-year high ... had 14.6 percent of its residents living below the poverty level, compared with 9.5 percent in 2007
February 11, 2009 Conference Report on American Recovery and Reinvestment Act
Savings and Green Jobs o Provides $20 billion in tax incentives for renewable energy and energy efficiency competitive and energy independent, and transforming our economy. · Give 95 percent of American workers an immediate tax cut. · Invest in roads, bridges, mass transit, energy efficient buildings, flood control
November 20th, 2013 BC Hydro Today
November 20th, 2013 DRAFT 1 BC Hydro Today FY 2013 IPPs in BC; BC Hydro serves 95 percent of the population in British Columbia1 ... Columbia Load are split evenly (IPPs are under contract with BC Hydro)3 Limited transfer capability into BC from
TO PROFITABLE GUAR (Retyped from 1977 Texas Agricultural Experiment Station bulletin)
Mukhtar, Saqib
Harvesting, Marketing, Economics. KEYS TO PROFITABLE GUAR PRODUCTION Leland D. Tripp, Dale A. Lovelace ... galactomannan gum which forms a viscous gel in cold water. Perhaps the best-known use of guar gum ... remaining after the extraction of gum contains about 35 percent protein. Of this about 95 percent
HOW DO THEY DO IT DOWN UNDER? New Zealand dairy producers have huge exports and low costs
Radeloff, Volker C.
HOW DO THEY DO IT DOWN UNDER? New Zealand dairy producers have huge exports and low costs. But it is the world's largest dairy exporter, and, unlike the European Union and the United States, New Zealand provides no export subsidies. About 95 percent of New Zealand milk ends up as dairy products consumed
Surface Water Development in Texas.
McNeely, John G.; Lacewell, Ronald D.
1977-01-01
Appendix Tables; Appendix A: Major Conservation Storage Reservoirs; Appendix B: Water Development Board Policy; References ... of acre-feet. In Texas, 95 percent of the total conservation storage capacity is concentrated in 63 reservoirs. The Texas Water Development Board has not provided a published figure on average annual yield of surface water from these reservoirs...
How to confirm and exclude different models of material properties in the Casimir effect
V. M. Mostepanenko
2014-11-17
We formulate a method allowing one to confirm or exclude alternative models of material properties at a definite confidence level in experiments measuring the Casimir force. The method is based on the consideration of differences between the theoretical and mean measured quantities and the confidence intervals for these differences found at sufficiently high or low confidence probabilities. The developed method is applied to the data of four recent experiments measuring the gradient of the Casimir force by means of a dynamic atomic force microscope. It is shown that in experiments with Au-Au and Ni-Ni test bodies, where the Drude model approach is excluded at a 95% confidence level, the plasma model approach agrees with the data at higher than 90% confidence. In experiments using an Au sphere interacting with either a Ni plate or a graphene-coated substrate, the measurement data agree with the common prediction of the Drude and plasma model approaches and with theory using the polarization tensor at the 90% and 80% confidence levels, respectively.
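The core construction this abstract describes, a confidence interval for the difference between theoretical and measured quantities with a model disfavored when the interval excludes zero, can be sketched in simplified form. This is a generic normal-approximation illustration with invented names, not the authors' actual procedure:

```python
import math

def mean_difference_ci(theory, measured, z=1.645):
    # CI for the mean of (theory - measured); z = 1.645 gives ~90% two-sided.
    d = [t - m for t, m in zip(theory, measured)]
    n = len(d)
    mean = sum(d) / n
    s2 = sum((x - mean) ** 2 for x in d) / (n - 1)
    half = z * math.sqrt(s2 / n)
    return mean - half, mean + half

def model_excluded(theory, measured, z=1.645):
    # A model is disfavored at this level if zero difference
    # lies outside the interval.
    lo, hi = mean_difference_ci(theory, measured, z)
    return not (lo <= 0.0 <= hi)
```

Raising z widens the interval and makes exclusion harder, which is the trade-off behind the paper's choice of "sufficiently high or low confidence probabilities."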
Solar Census - Perfecting the Art of Automated, Remote Solar Shading Assessments (Fact Sheet)
Not Available
2014-04-01
To validate the work completed by Solar Census as part of the Department of Energy SunShot Incubator 8 award, NREL validated the performance of the Solar Census Surveyor tool against the industry-standard Solmetric SunEye measurements for 4 residential sites in California that experienced light to heavy shading. Using a two one-sided test (TOST) of statistical equivalence, NREL found that the mean differences between the Solar Census and SunEye mean solar access values for Annual, Summer, and Winter readings fall within the 95% confidence intervals and the confidence intervals themselves fall within the tolerances of +/- 5 SAVs; thus the Solar Census calculations are statistically equivalent to the SunEye measurements.
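TOST can be phrased as a confidence-interval check: equivalence within ±delta at level alpha holds when the (1 − 2·alpha) interval for the mean difference lies entirely inside (−delta, +delta). The sketch below uses a normal approximation (z rather than a t quantile) and invented sample differences; it is not NREL's code:

```python
import math

def tost_equivalent(diffs, delta, z=1.645):
    # 90% CI for the mean difference; equivalence at alpha = 0.05
    # iff the interval sits entirely inside (-delta, +delta).
    n = len(diffs)
    mean = sum(diffs) / n
    s2 = sum((x - mean) ** 2 for x in diffs) / (n - 1)
    half = z * math.sqrt(s2 / n)
    return -delta < mean - half and mean + half < delta

# Illustrative solar-access-value differences against a +/- 5 SAV tolerance:
print(tost_equivalent([1.2, -0.8, 0.5, -1.1, 0.3], 5.0))  # → True
```

Note the asymmetry with an ordinary significance test: TOST rejects non-equivalence only when the data actively demonstrate the difference is small, not merely when it fails to look large.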
Familial site-specific Ovarian cancer is linked to BRCA1 on 17q12-21
Steichen-Gersdorf, E.; Gallion, H.H.; Ponder, M.A.; Pye, C.; Mazoyer, S.; Smith, S.A.; Ponder, B.A.J.; Ford, D.; Easton, D.F.; Girodet, C.
1994-11-01
In a study of nine families with "site-specific" ovarian cancer (criterion: three or more cases of epithelial ovarian cancer and no cases of breast cancer diagnosed at age <50 years) we have obtained evidence of linkage to the breast-ovarian cancer susceptibility gene, BRCA1, on 17q12-21. If the risk of cancer in these families is assumed to be restricted to the ovary, the best estimate of the proportion of families linked to BRCA1 is .78 (95% confidence interval .32-1.0). If predisposition to both breast and ovarian cancer is assumed, the proportion linked is 1.0 (95% confidence interval .46-1.0). The linkage of familial site-specific ovarian cancer to BRCA1 indicates the possibility of predictive testing in such families; however, this is only appropriate in families where the evidence for linkage to BRCA1 is conclusive. 17 refs., 3 figs., 1 tab.
A utility evaluation of nondestructive testing devices used on asphalt concrete pavements
Stoffels, Shelley Marie
1986-01-01
defined. Weighting factors are developed using the Churchman-Ackoff technique. The analysis is performed under uncertainty using a beta probability distribution. The calculations are performed using a computer program. The results are expressed... in terms of an expected value and a 95% confidence interval. Fifteen nondestructive testing devices are evaluated for use for both project-level design and network-level planning on asphalt concrete pavements. These devices are described in detail...
Newsom, Douglas Floyd
1989-01-01
bones. A coefficient of correlation of 0.82 was calculated with a 99.9% confidence interval. ACKNOWLEDGEMENTS The author wishes to express his gratitude to committee members Dr. Gerald Schlapper, Dr. John Poston Sr., and Dr. Dan Hightower... of interest, calcium. Another technique, neutron activation analysis (NAA), can quantify actual calcium content in bone. Neutron activation analysis is discussed in further detail in the literature review and methods sections. Bone is composed of two types...
Global Change Biology (1996)2,169-182 Measurements of carbon sequestration by long-term
Rose, Michael R.
1996-01-01
Global Change Biology (1996) 2, 169-182. Measurements of carbon sequestration by long-term eddy covariance... The integrated carbon sequestration in 1994 was 2.1 t C ha^-1 y^-1, with a 90% confidence interval due to sampling... an overall uncertainty on the annual carbon sequestration in 1994 of -0.3 to +0.8 t C ha^-1 y^-1. Keywords
Akrami, Yashar; Savage, Christopher; Scott, Pat; Conrad, Jan; Edsjö, Joakim E-mail: savage@fysik.su.se E-mail: conrad@fysik.su.se
2011-07-01
Models of weak-scale supersymmetry offer viable dark matter (DM) candidates. Their parameter spaces are however rather large and complex, such that pinning down the actual parameter values from experimental data can depend strongly on the employed statistical framework and scanning algorithm. In frequentist parameter estimation, a central requirement for properly constructed confidence intervals is that they cover true parameter values, preferably at exactly the stated confidence level when experiments are repeated infinitely many times. Since most widely-used scanning techniques are optimised for Bayesian statistics, one needs to assess their abilities in providing correct confidence intervals in terms of the statistical coverage. Here we investigate this for the Constrained Minimal Supersymmetric Standard Model (CMSSM) when only constrained by data from direct searches for dark matter. We construct confidence intervals from one-dimensional profile likelihoods and study the coverage by generating several pseudo-experiments for a few benchmark sets of pseudo-true parameters. We use nested sampling to scan the parameter space and evaluate the coverage for the benchmarks when either flat or logarithmic priors are imposed on gaugino and scalar mass parameters. The sampling algorithm has been used in the configuration usually adopted for exploration of the Bayesian posterior. We observe both under- and over-coverage, which in some cases vary quite dramatically when benchmarks or priors are modified. We show how most of the variation can be explained as the impact of explicit priors as well as sampling effects, where the latter are indirectly imposed by physicality conditions. For comparison, we also evaluate the coverage for Bayesian credible intervals, and observe significant under-coverage in those cases.
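The coverage check this abstract describes (repeat pseudo-experiments many times and count how often the stated interval contains the pseudo-true value) can be illustrated with a deliberately simple Gaussian toy model. This is not the CMSSM profile-likelihood machinery, just the bare definition of frequentist coverage:

```python
import math
import random

def coverage(mu_true=1.0, sigma=2.0, n=20, trials=2000, seed=42):
    """Estimate the coverage of a nominal 95% confidence interval for a
    Gaussian mean by repeated pseudo-experiments.  With sigma known,
    the interval m +/- 1.96*sigma/sqrt(n) has exact 95% coverage, so
    the empirical fraction should hover near 0.95."""
    rng = random.Random(seed)
    z = 1.96
    hits = 0
    for _ in range(trials):
        xs = [rng.gauss(mu_true, sigma) for _ in range(n)]
        m = sum(xs) / n
        se = sigma / math.sqrt(n)
        if m - z * se <= mu_true <= m + z * se:
            hits += 1
    return hits / trials

cov = coverage()
```

In the paper's setting the same loop is run with a full likelihood scan inside each pseudo-experiment, which is where priors and sampling effects can push the empirical fraction away from the nominal 95%.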
Diabetes and tuberculosis: the impact of the diabetes epidemic on tuberculosis incidence
Stevenson, Catherine R.; Forouhi, Nita G.; Roglic, Gojka; Williams, Brian G.; Lauer, Jeremy A.; Dye, Christopher; Unwin, Nigel
2007-09-06
...tion, as well as estimates stratified by age and sex. The published 95% confidence intervals for the relative risks... The proportion by which the incidence rate of the outcome of interest (here, incident tuberculosis)...
Goulart, Jennifer; Truong, Pauline; Woods, Ryan; Speers, Caroline H.; Kennecke, Hagen; Nichol, Alan
2011-07-01
Purpose: The role of adjuvant postmastectomy radiotherapy (PMRT) remains controversial for the rare presentation of pT3pN0cM0 breast cancer. The present analysis examined locoregional recurrence (LRR) and breast cancer-specific survival (BCSS) in pT2 = 5.0-cm and pT3 >5.0-cm tumors treated with mastectomy, stratified by PMRT use. Materials and Methods: Between January 1, 1989 and December 31, 2000, the British Columbia provincial database yielded 100 node-negative patients with tumors ≥5 cm among 19,846 nonmetastatic breast cancer patients (0.5%). Of these 100 patients, 44 (44%) had received adjuvant PMRT. Results: The PMRT group contained significantly more pT3 >5-cm cases (p = 0.001) and margin-positive cases (p = .03). With a median follow-up of 10 years, the cumulative 10-year LRR rate was 2.3% (95% confidence interval, 0.2-10.5) in the PMRT group vs. 8.9% (95% confidence interval, 3.2-18.2) in the no-PMRT group (p = .2). Regarding LRR in the no-PMRT group, all patients with recurrence had Grade 3 histologic features (LRR 17%, 5 of 29) and had not received hormonal therapy (LRR 15%, 5 of 34). The 10-year breast cancer-specific survival rate was 85.8% (95% confidence interval, 71.0-93.4) in the PMRT group vs. 74.6% (95% confidence interval, 59.9-84.5) in the no-PMRT group (p = .2). On multivariate analysis, adjusted for the prognostic and predictive variables, PMRT did not significantly improve the LRR or breast cancer-specific survival rates. Conclusion: The present study demonstrated a low LRR rate for node-negative breast cancer ≥5 cm. Our results indicate that PMRT should be considered for Grade 3 histologic features and for patients not undergoing hormonal therapy.
Frequentist limit setting in effective field theories
Gregersen, Kristian Damlund
2015-01-01
The original frequentist approach for computing confidence intervals involves the construction of the confidence belt which provides a mapping between the true value of the parameter and its maximum likelihood estimator. Alternative methods based on the frequentist idea exist, including the delta likelihood method, the $CL_s$ method and a method here referred to as the $p$-value method, which have all been commonly used in high energy experiments. The purpose of this article is to draw attention to a series of potential problems when applying these alternative methods to the important case where the predicted signal depends quadratically on the parameter of interest, a situation which is common in high energy physics as it covers scenarios encountered in effective theories. These include anomalous Higgs couplings and anomalous trilinear and quartic gauge couplings. It is found that the alternative methods, contrary to the original method using the confidence belt, in general do not manage to correctly describ...
The myth of science-based predictive modeling.
Hemez, F. M. (François M.)
2004-01-01
A key aspect of science-based predictive modeling is the assessment of prediction credibility. This publication argues that the credibility of a family of models and their predictions must combine three components: (1) the fidelity of predictions to test data; (2) the robustness of predictions to variability, uncertainty, and lack-of-knowledge; and (3) the prediction accuracy of models in cases where measurements are not available. Unfortunately, these three objectives are antagonistic. A recently published Theorem that demonstrates the irrevocable trade-offs between fidelity-to-data, robustness-to-uncertainty, and confidence in prediction is summarized. High-fidelity models cannot be made increasingly robust to uncertainty and lack-of-knowledge. Similarly, robustness-to-uncertainty can only be improved at the cost of reducing the confidence in prediction. The concept of confidence in prediction relies on a metric for total uncertainty, capable of aggregating different representations of uncertainty (probabilistic or not). The discussion is illustrated with an engineering application where a family of models is developed to predict the acceleration levels obtained when impacts of varying levels propagate through layers of crushable hyper-foam material of varying thicknesses. Convex modeling is invoked to represent a severe lack-of-knowledge about the constitutive material behavior. The analysis produces intervals of performance metrics from which the total uncertainty and confidence levels are estimated. Finally, performance, robustness and confidence are extrapolated throughout the validation domain to assess the predictive power of the family of models away from tested configurations.
Real Time Grid Reliability Management 2005
Eto, Joe
2008-01-01
case, confidence in grid security will increase. Confidence...
Texas Bullnettle and its Control.
Johnson, P. R.
1966-01-01
Acknowledgments... A solution of 0.1 percent 2,4-D amine in water, or combined with 0.1 percent picloram and 0.25 percent surfactant, killed 85 to 95 percent of the complete... nearer the crown. In outward appearance and in cross section, branch roots are similar to the main tuber. The small feeder roots are light brown in color, threadlike and brittle. Usually, only short sections of feeder roots can be removed...
Versatile P(acman) BAC Libraries for Transgenesis Studies in Drosophila melanogaster
Venken, Koen J.T.; Carlson, Joseph W.; Schulze, Karen L.; Pan, Hongling; He, Yuchun; Spokony, Rebecca; Wan, Kenneth H.; Koriabine, Maxim; de Jong, Pieter J.; White, Kevin P.; Bellen, Hugo J.; Hoskins, Roger A.
2009-04-21
We constructed Drosophila melanogaster BAC libraries with 21-kb and 83-kb inserts in the P(acman) system. Clones representing 12-fold coverage and encompassing more than 95% of annotated genes were mapped onto the reference genome. These clones can be integrated into predetermined attP sites in the genome using PhiC31 integrase to rescue mutations. They can be modified through recombineering, for example to incorporate protein tags and assess expression patterns.
Neural network predictions for Z' boson within LEP2 data set of Bhabha process
A. N. Buryk; V. V. Skalozub
2009-05-15
The neural network approach is applied to search for the Z'-boson within the LEP2 data set for e+ e- -> e+ e- scattering process. In the course of the analysis, the data set is reduced by 20 percent. The axial-vector and vector couplings of the Z' are estimated at 95 percent CL within a two-parameter fit. The mass is determined to be 0.53-1.05 TeV. Comparisons with other results are given.
Impacts of stripmining lignite on net returns for agricultural enterprises in East Texas
Morris, Christina
1984-01-01
mining. Eighty to 95 percent of the lignite is recovered. Mining is currently done only to depths of 120 feet; greater than 150 feet is not yet economically feasible (Kaiser et al. 1980). During mining, draglines remove the earth above the lignite... (multiseam) to be mined. The same equipment, draglines and/or scrapers, is used to replace and contour the overburden after the lignite is removed. Once the overburden is replaced and contoured, the land is revegetated using bermudagrass in warm weather...
Local Resource Management Institutions: A Case Study on Sokshing Management
Wangchuk, Sangay
2001-01-01
in food production. This intervention also had a direct impact on the local water management institution. Although the membership pattern is similar to the informal one, the inclusion of new members is formalised and legitimised through the axiom... than 95 percent of the individuals interviewed said that the practice of Reedum is good for the whole community. In other words, agricultural crops are protected from natural calamities such as floods, storms and insect epidemics by the deities residing...
Gershenzon, Naum I; Ritzi, Robert W; Dominic, David F
2014-01-01
The Victor Unit of the Ivishak Formation in the Prudhoe Bay Oilfield is characterized by high net-to-gross fluvial sandstones and conglomerates. The highest permeability is found within sets of cross-strata of open-framework conglomerate (OFC). They are preserved within unit bar deposits and assemblages of unit bar deposits within compound (braid) bar deposits. They are thief zones limiting enhanced oil recovery. We incorporate recent research that has quantified important attributes of their sedimentary architecture within preserved deposits. We use high-resolution models to demonstrate the fundamental aspects of their control on oil production rate, water breakthrough time, and spatial and temporal distribution of residual oil saturation. We found that when the pressure gradient is oriented perpendicular to the paleoflow direction, the total oil production and the water breakthrough time are larger, and remaining oil saturation is smaller, than when it is oriented parallel to paleoflow. The pressure differe...
Maier, Robert S.
and negative magnetization. In the weak-noise limit, noise-activated magnetization reversals become exponentially rare, the reversal rate being given by the Kramers formula $\Gamma \sim \Gamma_0 \exp[\ldots]$..., who worked out a 'large deviation theory' of its magnetization reversals, but did not compute
Arzuman, Sadun
2004-09-30
In this study, the structure, depositional system, and the seismic stratigraphy of the VLE 196 area, Block V in Lamar Field were interpreted using 3-D seismic data and well logs to characterize structural and depositional settings of the Guasare...
2015-01-01
stimulation and norepinephrine infusion. LV endocardial and... Of note, norepinephrine infusion did not increase DOR or Tp-... to continuous intravenous infusion of α-chloralose (10 mg/
Guennou, L.; Adami, C.; Ulmer, M.P.; LeBrun, V.; Durret, F.; Johnston, D.; Ilbert, O.; Clowe, D.; Gavazzi, R.; Murphy, K.; Schrabback, T.; /Leiden Observ. /Fermilab
2010-08-01
As a contribution to the understanding of the dark energy concept, the Dark energy American French Team (DAFT, in French FADA) has started a large project to characterize statistically high redshift galaxy clusters, infer cosmological constraints from Weak Lensing Tomography, and understand biases relevant for constraining dark energy and cluster physics in future cluster and cosmological experiments. Aims. The purpose of this paper is to establish the basis of reference for the photo-z determination used in all our subsequent papers, including weak lensing tomography studies. This project is based on a sample of 91 high redshift (z ≥ 0.4), massive (≳ 3 × 10^14 M_⊙) clusters with existing HST imaging, for which we are presently performing complementary multi-wavelength imaging. This allows us in particular to estimate spectral types and determine accurate photometric redshifts for galaxies along the lines of sight to the first ten clusters for which all the required data are available, down to a limit of I_AB = 24/24.5, with the LePhare software. The accuracy in redshift is of the order of 0.05 for the range 0.2 ≤ z ≤ 1.5. We verified that the technique applied to obtain photometric redshifts works well by comparing our results with previous work. In clusters, photo-z accuracy is degraded for bright absolute magnitudes and for the latest and earliest type galaxies. The photo-z accuracy also varies only slightly as a function of the spectral type for field galaxies. As a consequence, we find evidence for an environmental dependence of the photo-z accuracy, interpreted as the standard Spectral Energy Distributions used being not well suited to cluster galaxies. Finally, we modeled the LCDCS 0504 mass with the strong arcs detected along this line of sight.
David B. Wood
2009-10-08
Between 1951 and 1992, underground nuclear weapons testing was conducted at 828 sites on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.
Bernstein, Dennis S.
examples of systems with inequality-constrained states arise in aeronautics [9]. The equality and an inequality-constrained state vector are mutually exclusive assumptions even for linear systems [5, 6 algorithms have been developed for inequality-constrained linear state estimation. One of the most popular
Cole, R.D.; Nelson, W.J. )
1993-03-01
The Mississippian Ste. Genevieve and Paoli Limestones and sandstones of the Aux Vases Formation are lateral facies of one another. This interpretation is based on comprehensive investigations of outcrops, and selected cores, samples of well cuttings, and geophysical logs conducted over a period of four years. Both units exhibit similar sedimentological characteristics and represent open marine, shallow subtidal, and intertidal environments. The presence of low-angle cross-laminae, ripple- and plane-laminae, climbing ripples, and ooid shoals suggests most deposition occurred under low energy conditions. Lenticular, channel-like scour and fill structures that contain both fine-grained quartz sand and abraded, disarticulated fossil fragments indicate localized higher energy deposition. The authors' studies indicate that siliciclastic vs. carbonate deposition was controlled strictly by available sediment, and not by regressive (siliciclastic) and transgressive (carbonate) events, as inferred by previous workers. This conclusion is based on lateral facies relationships, and the supplanting of carbonates by clastics occurring in the upper part of the Ste. Genevieve through the middle part of the Paoli. The Aux Vases is thickest, coarsest, and least mature in the northwestern part of the Illinois Basin, and pinches out to the southeast. This implies a northwesterly source for clastics, perhaps the Transcontinental Arch. After early Chesterian time, the Transcontinental Arch apparently supplied little or no sediment to any flanking basin. The Ste. Genevieve, Paoli, and Aux Vases are major oil-producing units in the Illinois Basin. New understanding of regional relationships should enhance exploratory success and improve recovery from established fields.
Timing of Radiotherapy and Outcome in Patients Receiving Adjuvant Endocrine Therapy
Karlsson, Per, E-mail: per.karlsson@oncology.gu.s [Department of Oncology, Sahlgrenska University Hospital, Gothenburg (Sweden); Cole, Bernard F. [Department of Mathematics and Statistics, University of Vermont College of Engineering and Mathematical Sciences, Burlington, VT (United States); International Breast Cancer Study Group Statistical Center, Department of Biostatistics and Computational Biology, Dana-Farber Cancer Institute, Boston, MA (United States); Colleoni, Marco [Department of Medicine, Research Unit in Medical Senology, European Institute of Oncology, Milan (Italy); Roncadin, Mario [Department of Radiotherapy, Centro di Riferimento Oncologico, Aviano (Italy); Chua, Boon H. [Department of Radiation Oncology, Peter MacCallum Cancer Centre, Melbourne (Australia); Murray, Elizabeth [Department of Radiation Oncology, Groote Shuur Hospital and University of Cape Town, Cape Town (South Africa); Price, Karen N. [International Breast Cancer Study Group Statistical Center, Frontier Science and Technology Research Foundation, Boston, MA (United States); Castiglione-Gertsch, Monica [International Breast Cancer Study Group Coordinating Center, Bern (Switzerland); Goldhirsch, Aron [European Institute of Oncology, Milan (Italy); Oncology Institute of Southern Switzerland, Bellinzona (Switzerland); Gruber, Guenther [Institut fuer Radiotherapie, Klinik Hirslanden, Zuerich (Switzerland)
2011-06-01
Purpose: To evaluate the association between the interval from breast-conserving surgery (BCS) to radiotherapy (RT) and the clinical outcome among patients treated with adjuvant endocrine therapy. Patients and Methods: Patient information was obtained from three International Breast Cancer Study Group trials. The analysis was restricted to 964 patients treated with BCS and adjuvant endocrine therapy. The patients were divided into two groups according to the median number of days between BCS and RT and into four groups according to the quartile of time between BCS and RT. The endpoints were the interval to local recurrence, disease-free survival, and overall survival. Proportional hazards regression analysis was used to perform comparisons after adjustment for baseline factors. Results: The median interval between BCS and RT was 77 days. RT timing was significantly associated with age, menopausal status, and estrogen receptor status. After adjustment for these factors, no significant effect of an RT delay ≤20 weeks was found. The adjusted hazard ratio for RT within 77 days vs. after 77 days was 0.94 (95% confidence interval [CI], 0.47-1.87) for the interval to local recurrence, 1.05 (95% CI, 0.82-1.34) for disease-free survival, and 1.07 (95% CI, 0.77-1.49) for overall survival. For the interval to local recurrence, the adjusted hazard ratio for ≤48, 49-77, and 78-112 days was 0.90 (95% CI, 0.34-2.37), 0.86 (95% CI, 0.33-2.25), and 0.89 (95% CI, 0.33-2.41), respectively, relative to ≥113 days. Conclusion: An RT delay of ≤20 weeks was significantly associated with baseline factors such as age, menopausal status, and estrogen-receptor status. After adjustment for these factors, the timing of RT was not significantly associated with the interval to local recurrence, disease-free survival, or overall survival.
RADIATION PRESSURE DETECTION AND DENSITY ESTIMATE FOR 2011 MD
Micheli, Marco; Tholen, David J.; Elliott, Garrett T. E-mail: tholen@ifa.hawaii.edu
2014-06-10
We present our astrometric observations of the small near-Earth object 2011 MD (H ≈ 28.0), obtained after its very close fly-by to Earth in 2011 June. Our set of observations extends the observational arc to 73 days and, together with the published astrometry obtained around the Earth fly-by, allows a direct detection of the effect of radiation pressure on the object, with a confidence of 5σ. The detection can be used to put constraints on the density of the object, pointing either to an unexpectedly low value of ρ = (640 ± 330) kg m^-3 (68% confidence interval), if we assume a typical probability distribution for the unknown albedo, or to an unusually high reflectivity of its surface. This result may have important implications both in terms of impact hazard from small objects and in light of a possible retrieval of this target.
Comment on the Word 'Cooling' as it is Used in Beam Physics
Sessler, Andrew M.
2005-09-10
Meta-Analyses of the Associations of Respiratory Health Effectswith Dampness and Mold in Homes
Fisk, William J.; Lei-Gomez, Quanhong; Mendell, Mark J.
2006-01-01
The Institute of Medicine (IOM) of the National Academy of Sciences recently completed a critical review of the scientific literature pertaining to the association of indoor dampness and mold contamination with adverse health effects. In this paper, we report the results of quantitative meta-analysis of the studies reviewed in the IOM report. We developed point estimates and confidence intervals (CIs) to summarize the association of several respiratory and asthma-related health outcomes with the presence of dampness and mold in homes. The odds ratios and confidence intervals from the original studies were transformed to the log scale and random effect models were applied to the log odds ratios and their variance. Models were constructed both accounting for the correlation between multiple results within the studies analyzed and ignoring such potential correlation. Central estimates of ORs for the health outcomes ranged from 1.32 to 2.10, with most central estimates between 1.3 and 1.8. Confidence intervals (95%) excluded unity except in two of 28 instances, and in most cases the lower bound of the CI exceeded 1.2. In general, the two meta-analysis methods produced similar estimates for ORs and CIs. Based on the results of the meta-analyses, building dampness and mold are associated with approximately 30% to 80% increases in a variety of respiratory and asthma-related health outcomes. The results of these meta-analyses reinforce the IOM's recommendation that actions be taken to prevent and reduce building dampness problems.
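The pooling step described here (transform reported ORs to the log scale, back-calculate each study's variance from its 95% CI, and fit a random-effects model) can be sketched with the standard DerSimonian-Laird estimator. The study values below are made up for illustration, not the IOM data:

```python
import math

def dersimonian_laird(ors, ci_los, ci_his):
    """Random-effects pooling of odds ratios on the log scale
    (DerSimonian-Laird).  Each study's variance is back-calculated
    from its reported 95% CI: se = (ln(hi) - ln(lo)) / (2 * 1.96).
    Returns the pooled OR and its 95% CI."""
    y = [math.log(o) for o in ors]
    v = [((math.log(hi) - math.log(lo)) / (2 * 1.96)) ** 2
         for lo, hi in zip(ci_los, ci_his)]
    w = [1 / vi for vi in v]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
    k = len(y)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)             # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]           # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

# Hypothetical study ORs with their 95% CI bounds
or_pool, ci = dersimonian_laird([1.5, 1.8, 1.3], [1.1, 1.2, 1.0], [2.1, 2.7, 1.7])
```

When the heterogeneity statistic Q is below its degrees of freedom, tau2 is truncated to zero and the random-effects result coincides with the fixed-effect one, which is why the paper reports that the two approaches often produced similar estimates.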
Fast detection of nonlinearity and nonstationarity in short and noisy time series
M. De Domenico; V. Latora
2010-07-07
We introduce a statistical method to detect nonlinearity and nonstationarity in time series that works even for short sequences and in the presence of noise. The method has a discrimination power similar to that of the most advanced estimators available, yet it depends on only one parameter, is easier to implement, and is faster. Applications to real data sets reject the null hypothesis of an underlying stationary linear stochastic process at a higher confidence level than the best known nonlinear discriminators to date.
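A toy version of such a hypothesis test can be built with surrogate data: compute a discriminating statistic on the series, then compare it against its distribution over randomized surrogates. The sketch below uses shuffle surrogates and the classic time-reversal asymmetry statistic, not the authors' actual discriminator:

```python
import random

def surrogate_pvalue(series, n_surrogates=200, seed=7):
    """Toy surrogate-data test.  The time-reversal asymmetry statistic
    mean((x[t+1] - x[t])**3), a classic nonlinearity discriminator, is
    compared against its distribution over random shuffles of the
    series (which destroy all temporal structure).  Returns the
    fraction of surrogates at least as extreme: an empirical p-value.
    """
    def asym(xs):
        n = len(xs) - 1
        return sum((xs[i + 1] - xs[i]) ** 3 for i in range(n)) / n

    rng = random.Random(seed)
    s0 = abs(asym(series))
    hits = 0
    for _ in range(n_surrogates):
        sur = series[:]
        rng.shuffle(sur)
        if abs(asym(sur)) >= s0:
            hits += 1
    return hits / n_surrogates

# Chaotic logistic-map series: short, deterministic, strongly nonlinear
x, series = 0.4, []
for _ in range(1000):
    x = 4.0 * x * (1 - x)
    series.append(x)
p = surrogate_pvalue(series)  # small p: temporal structure detected
```

Shuffle surrogates reject any temporal structure, linear or not; distinguishing nonlinearity specifically requires phase-randomized surrogates that preserve the linear correlations, which is the refinement the paper's method and its competitors address.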
The Two-Point Correlation Function of Gamma-ray Bursts
Li, Ming-Hua
2015-01-01
In this paper, we examine the spatial distribution of gamma-ray bursts (GRBs) using a sample of 373 objects. We subdivide the GRB data into two redshift intervals over the redshift range $0 < z \le \ldots$, fit a power law $\xi(r) = (r/r_0)^{-\gamma}$ to the measured $\xi(r)$, and obtain an amplitude and slope of $r_0 = 1235.2 \pm 342.6~h^{-1}$ Mpc and $\gamma = 0.80 \pm 0.19$ ($1\sigma$ confidence level) over the scales $r = 200$ to $10^4~h^{-1}$ Mpc. Our ...
A new method for measurement of safety rod drop times
Pesic, M.; Stefanovic, D. ); Marinkovic, P. )
1992-10-01
In this paper, a new method for the accurate measurement of safety rod drop times is proposed. It is based on a fast electromagnetic transducer and an analog-to-digital converter (ADC) connected to a computer system. Evaluation of recorded data is conducted by a purpose-developed computer code. The first measurements, performed at the HERBE fast-thermal RB reactor, show that a relative uncertainty (confidence level 95%) of less than 6% can be achieved in determining rod drop time (with time intervals ranging from 0.4 to 10.0 s). Further improvements in accuracy are possible.
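The quoted relative uncertainty can be reproduced in outline: from repeated drop-time measurements, form a 95% confidence interval for the mean and express its half-width relative to the mean. The timings below are hypothetical, and a normal quantile stands in for the t quantile:

```python
import math

def mean_ci(samples, z=1.96):
    """95% normal-approximation confidence interval for the mean of
    repeated drop-time measurements, plus the relative half-width
    (the 'relative uncertainty' figure quoted in the abstract).
    Sketch only; the paper's evaluation code is not reproduced."""
    n = len(samples)
    m = sum(samples) / n
    s = math.sqrt(sum((x - m) ** 2 for x in samples) / (n - 1))
    half = z * s / math.sqrt(n)
    return m, (m - half, m + half), half / m

# Hypothetical repeated measurements of one rod's drop time (seconds)
times = [0.82, 0.85, 0.84, 0.83, 0.86, 0.84, 0.85, 0.83]
m, ci, rel = mean_ci(times)
```

With a repeatable transducer the scatter is small, so the relative half-width lands comfortably under the 6% bound the authors report.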
Schrijvers, Michiel L. [Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Pattje, Wouter J. [Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Department of Radiation Oncology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Slagter-Menkema, Lorian [Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Mastik, Mirjam F.; Gibcus, Johan H. [Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Langendijk, Johannes A. [Department of Radiation Oncology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Wal, Jacqueline E. van der [Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Laan, Bernard F.A.M. van der [Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen (Netherlands); Schuuring, E., E-mail: e.schuuring@umcg.nl [Department of Pathology, University Medical Center Groningen, University of Groningen, Groningen (Netherlands)
2012-07-15
Purpose: We recently reported on the identification of the Fas-associated death domain (FADD) as a possible driver of the chromosome 11q13 amplicon and the association between increased FADD expression and disease-specific survival in advanced-stage laryngeal carcinoma. The aim of this study was to examine whether expression of FADD and its Ser194-phosphorylated isoform (pFADD) predicts local control in patients with early-stage glottic carcinoma primarily treated with radiotherapy only. Methods and Materials: Immunohistochemical staining for FADD and pFADD was performed on pretreatment biopsy specimens of 92 patients with T1-T2 glottic squamous cell carcinoma primarily treated with radiotherapy between 1996 and 2005. Cox regression analysis was used to correlate expression levels with local control. Results: High levels of pFADD were associated with significantly better local control (hazard ratio, 2.40; 95% confidence interval, 1.04-5.55; p = 0.040). FADD overexpression showed a trend toward better local control (hazard ratio, 3.656; 95% confidence interval, 0.853-15.663; p = 0.081). Multivariate Cox regression analysis showed that high pFADD expression was the best predictor of local control after radiotherapy. Conclusions: This study showed that expression of phosphorylated FADD is a new prognostic biomarker for better local control after radiotherapy in patients with early-stage glottic carcinomas.
Kuhan, Ganesh, E-mail: gkuhan@nhs.net; Abisi, Said; Braithwaite, Bruce D.; MacSweeney, Shane T. R. [Nottingham University Hospitals, Vascular and Endovascular Unit, Queens Medical Centre (United Kingdom); Whitaker, Simon C.; Habib, Said B. [Nottingham University Hospitals, Department of Radiology, Queen's Medical Centre (United Kingdom)
2012-10-15
Purpose: To evaluate early patency rate of the heparin-bonded stent grafts in atherosclerotic long femoropopliteal occlusive disease, and to identify factors that affect outcome. Methods: Heparin-bonded Viabahn stent grafts were placed in 33 limbs in 33 patients during 2009-2010. The stents were deployed to rescue failed conventional balloon angioplasty. Mean age was 69 (range 44-88) years, and 67 % (22 of 33) were men. Most procedures (21 of 33, 64 %) were performed for critical limb ischemia (33 % for rest pain, 30 % tissue loss). Kaplan-Meier plots and Cox regression analysis were used to identify significant risk factors. Results: The average length of lesions treated was 25 ± 10 cm, and they were predominantly TASC (Transatlantic Intersociety Consensus) D (n = 13) and C (n = 17) lesions. The median primary patency was 5.0 months (95 % confidence interval 1.22-8.77). The mean secondary patency was 8.6 months (95 % confidence interval 6.82-10.42). Subsequently, 4 patients underwent bypass surgery and 5 patients underwent major amputation. One patient died. There were 5 in-stent or edge-stent stenoses. Cox multivariate regression analysis identified TASC D lesions to be a significant risk factor for early occlusion (p = 0.035). Conclusion: TASC D lesions of femoropopliteal occlusions have poor patency rates with the use of heparin-bonded stent grafts after failed conventional angioplasty. Alternative options should be considered for these patients.
Stereotactic Radiosurgery for Acoustic Neuromas: What Happens Long Term?
Roos, Daniel E., E-mail: daniel.roos@health.sa.gov.au [Department of Radiation Oncology, Royal Adelaide Hospital, Adelaide, South Australia (Australia); University of Adelaide School of Medicine, Adelaide, South Australia (Australia); Potter, Andrew E. [Department of Radiation Oncology, Royal Adelaide Hospital, Adelaide, South Australia (Australia); Brophy, Brian P. [Department of Neurosurgery, Royal Adelaide Hospital, Adelaide, South Australia (Australia); University of Adelaide School of Medicine, Adelaide, South Australia (Australia)
2012-03-15
Purpose: To determine the clinical outcomes for acoustic neuroma treated with low-dose linear accelerator stereotactic radiosurgery (SRS) >10 years earlier at the Royal Adelaide Hospital using data collected prospectively at a dedicated SRS clinic. Methods and Materials: Between November 1993 and December 2000, 51 patients underwent SRS for acoustic neuroma. For the 44 patients with primary SRS for sporadic (unilateral) lesions, the median age was 63 years, the median of the maximal tumor diameter was 21 mm (range, 11-34), and the marginal dose was 14 Gy for the first 4 patients and 12 Gy for the other 40. Results: The crude tumor control rate was 97.7% (1 patient required salvage surgery for progression at 9.75 years). Only 8 (29%) of 28 patients ultimately retained useful hearing (interaural pure tone average ≤50 dB). Also, although the Kaplan-Meier estimated rate of hearing preservation at 5 years was 57% (95% confidence interval, 38-74%), this decreased to 24% (95% confidence interval, 11-44%) at 10 years. New or worsened V and VII cranial neuropathy occurred in 11% and 2% of patients, respectively; all cases were transient. No case of radiation oncogenesis developed. Conclusions: The long-term follow-up data of low-dose (12-14 Gy) linear accelerator SRS for acoustic neuroma have confirmed excellent tumor control and acceptable cranial neuropathy rates but a continual decrease in hearing preservation out to ≥10 years.
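The hearing-preservation figures above are Kaplan-Meier estimates with 95% confidence intervals. A minimal sketch of the estimator with Greenwood's variance formula might look like the following; the data are illustrative, not the study's:

```python
import math

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve with Greenwood 95% confidence
    intervals. times: event/censoring times; events: 1 = event,
    0 = censored. Returns (time, survival, lower, upper) tuples."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, var_sum, curve = 1.0, 0.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        n = at_risk
        if d > 0:
            s *= (n - d) / n
            var_sum += d / (n * (n - d))          # Greenwood term
            se = s * math.sqrt(var_sum)
            curve.append((t, s,
                          max(0.0, s - 1.96 * se),
                          min(1.0, s + 1.96 * se)))
        while i < len(data) and data[i][0] == t:  # drop subjects at t
            at_risk -= 1
            i += 1
    return curve

# Illustrative data: times in years, 1 = event (hearing loss), 0 = censored
curve = kaplan_meier([2, 4, 4, 6, 8], [1, 1, 0, 1, 0])
```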
Makarov, Yuri V.; Etingov, Pavel V.; Huang, Zhenyu; Ma, Jian; Subbarao, Krishnappa
2010-10-19
In this paper, an approach to evaluate the uncertainties of the balancing capacity, ramping capability, and ramp duration requirements is proposed. The approach includes three steps: forecast data acquisition, statistical analysis of retrospective information, and prediction of grid balancing requirements for a specified time horizon and a given confidence level. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on histogram analysis, incorporating sources of uncertainty of both continuous (wind and load forecast errors) and discrete (forced generator outages and start-up failures) nature. A new method called the "flying-brick" technique is developed to evaluate the look-ahead required generation performance envelope for the worst case scenario within a user-specified confidence level. A self-validation process is used to validate the accuracy of the confidence intervals. To demonstrate the validity of the developed uncertainty assessment methods and their impact on grid operation, a framework for integrating the proposed methods with an EMS system is developed. Demonstration through integration with an EMS system illustrates the applicability of the proposed methodology and the developed tool for actual grid operation and paves the way for integration with EMS systems from other vendors.
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Huang, Zhenyu; Subbarao, Krishnappa
2011-06-23
An approach to evaluate the uncertainties of the balancing capacity, ramping capability, and ramp duration requirements is proposed. The approach includes three steps: forecast data acquisition, statistical analysis of retrospective information, and prediction of grid balancing requirements for a specified time horizon and a given confidence level. An assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on histogram analysis, incorporating sources of uncertainty - both continuous (wind and load forecast errors) and discrete (forced generator outages and start-up failures). A new method called the 'flying-brick' technique is developed to evaluate the look-ahead required generation performance envelope for the worst case scenario within a user-specified confidence level. A self-validation process is used to validate the accuracy of the confidence intervals. To demonstrate the validity of the developed uncertainty assessment methods and their impact on grid operation, a framework for integrating the proposed methods with an EMS system is developed. Demonstration through EMS integration illustrates the applicability of the proposed methodology and the developed tool for actual grid operation and paves the way for integration with EMS systems in control rooms.
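The histogram-based probabilistic analysis described in this abstract can be approximated, in very simplified form, by taking empirical quantiles of the forecast-error sample at a user-specified confidence level. The function name and synthetic errors below are illustrative assumptions, not the authors' tool:

```python
def percentile_interval(errors, confidence=0.95):
    """Two-sided interval covering `confidence` of the empirical
    forecast-error distribution; a simplified stand-in for the
    histogram-based probabilistic algorithm described above."""
    xs = sorted(errors)
    alpha = (1.0 - confidence) / 2.0
    lo_idx = int(alpha * (len(xs) - 1))
    hi_idx = int((1.0 - alpha) * (len(xs) - 1))
    return xs[lo_idx], xs[hi_idx]

# Synthetic wind-plus-load forecast errors (MW), uniform from -50 to 50
errors = [i - 50 for i in range(101)]
lo, hi = percentile_interval(errors)
```

The real method additionally convolves continuous errors with discrete outage events; this sketch shows only the confidence-level cut on the resulting distribution.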
Frequentist limit setting in effective field theories
Kristian Damlund Gregersen; Jørgen Beck Hansen
2015-09-09
The original frequentist approach for computing confidence intervals involves the construction of the confidence belt which provides a mapping between the true value of the parameter and its maximum likelihood estimator. Alternative methods based on the frequentist idea exist, including the delta likelihood method, the $CL_s$ method and a method here referred to as the $p$-value method, which have all been commonly used in high energy experiments. The purpose of this article is to draw attention to a series of potential problems when applying these alternative methods to the important case where the predicted signal depends quadratically on the parameter of interest, a situation which is common in high energy physics as it covers scenarios encountered in effective theories. These include anomalous Higgs couplings and anomalous trilinear and quartic gauge couplings. It is found that the alternative methods, contrary to the original method using the confidence belt, in general do not manage to correctly describe the relationship between the parameter of interest and its maximum likelihood estimator, and potentially over-constrain the parameter.
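The confidence-belt construction that the article contrasts with the alternative methods can be sketched for the simplest case: a Gaussian measurement of a parameter restricted to a grid. The grid, the 90% level, and the function names below are illustrative assumptions, not the article's notation:

```python
def neyman_belt(mu_grid, sigma=1.0):
    """Central 90% acceptance intervals for x ~ N(mu, sigma), one per
    candidate true value mu: the classic confidence-belt construction."""
    z = 1.645  # +/- 1.645 sigma covers 90% of a Gaussian
    return [(mu, mu - z * sigma, mu + z * sigma) for mu in mu_grid]

def invert_belt(belt, x_obs):
    """Confidence interval for mu: all mu whose acceptance interval
    contains the observed x."""
    accepted = [mu for mu, x1, x2 in belt if x1 <= x_obs <= x2]
    return min(accepted), max(accepted)

# mu restricted to [0, 5] on a 0.01 grid; observed x = 2.0 (illustrative)
belt = neyman_belt([i * 0.01 for i in range(501)])
ci = invert_belt(belt, 2.0)
```

For a quadratic signal dependence the likelihood in mu is no longer Gaussian, which is exactly where the article argues the shortcut methods diverge from this construction.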
2009-01-01
West Texas or South Texas; 67,000 bales came from the East Texas-Oklahoma territory; and 5,000 bales came from North Texas. An additional 27,000 bales of short staple cotton were obtained from the Central Belt. Mills surveyed also consumed about 28... of this report, the states of Texas and Oklahoma are considered to be the southwestern area. Cotton produced in this area has been predominantly of short staple length. These two states have produced more than 95 percent of all U.S. cotton under 1 inch...
An investigation of the effect of ammonia and amines on the recovery of oil
Richardson, James Malone
1958-01-01
, diethylamine, or normal butylamine in an air drive prior to waterflooding reduces the amount of oil recovered by the waterflood. 5. Although more than 95 percent of the ammonia is absorbed on the sand or in the fluids, this process does not prevent dis... AN INVESTIGATION OF THE EFFECT OF AMMONIA AND AMINES ON THE RECOVERY OF OIL. A Thesis by JAMES M. RICHARDSON. Submitted to the Graduate School of the Agricultural and Mechanical College of Texas in partial fulfillment of the requirements...
High flux solar energy transformation
Winston, R.; Gleckman, P.L.; O'Gallagher, J.J.
1991-04-09
Disclosed are multi-stage systems for high flux transformation of solar energy allowing for uniform solar intensification by a factor of 60,000 suns or more. Preferred systems employ a focusing mirror as a primary concentrative device and a non-imaging concentrator as a secondary concentrative device with concentrative capacities of primary and secondary stages selected to provide for net solar flux intensification of greater than 2000 over 95 percent of the concentration area. Systems of the invention are readily applied as energy sources for laser pumping and in other photothermal energy utilization processes. 7 figures.
Elliott, Fred C.; Norris, M. J.; Rea, H. E.
1955-01-01
-treat Johnsongrass in cotton in 1954. Power-driven sprayers normally used for insect control in row crops were modified for use. A spray pressure of 12 pounds per square inch was used. Two systems of oiling the grass were tried. In one system the crew applying the oil... crown-oilings with naphtha, 83 percent in 7 tests by 3 oilings, 95 percent in 6 tests by 4 oilings and 98 percent in 4 tests by 5 to 7 oilings. The use of mixtures of 50 percent naphtha and 50 percent kerosene or diesel fuel oil reduced...
McNeely, John G.; Tieken, A. W.
1956-01-01
and timbered hills that are characteristic of the Grand Prairie of Texas with rich sandy loam and black soils. The principal species in brush and trees are oak, elm, ash, pecan and cedar. Over 95 percent of the goat producers have other livestock... (Figure 3). These areas are dry and hilly with shallow, stony soils, and have mostly live oak and shin oak brush as vegetative overstory. The goat population increased from 1930 to 1940 and spread out of the original producing areas. By 1940...
T2K Collaboration; K. Abe; J. Adam; H. Aihara; T. Akiri; C. Andreopoulos; S. Aoki; A. Ariga; S. Assylbekov; D. Autiero; M. Barbi; G. J. Barker; G. Barr; P. Bartet-Friburg; M. Bass; M. Batkiewicz; F. Bay; V. Berardi; B. E. Berger; S. Berkman; S. Bhadra; F. d. M. Blaszczyk; A. Blondel; S. Bolognesi; S. Bordoni; S. B. Boyd; D. Brailsford; A. Bravar; C. Bronner; N. Buchanan; R. G. Calland; J. Caravaca Rodríguez; S. L. Cartwright; R. Castillo; M. G. Catanesi; A. Cervera; D. Cherdack; N. Chikuma; G. Christodoulou; A. Clifton; J. Coleman; S. J. Coleman; G. Collazuol; K. Connolly; L. Cremonesi; A. Dabrowska; I. Danko; R. Das; S. Davis; P. de Perio; G. De Rosa; T. Dealtry; S. R. Dennis; C. Densham; D. Dewhurst; F. Di Lodovico; S. Di Luise; S. Dolan; O. Drapier; T. Duboyski; K. Duffy; J. Dumarchez; S. Dytman; M. Dziewiecki; S. Emery-Schrenk; A. Ereditato; L. Escudero; T. Feusels; A. J. Finch; G. A. Fiorentini; M. Friend; Y. Fujii; Y. Fukuda; A. P. Furmanski; V. Galymov; A. Garcia; S. Giffin; C. Giganti; K. Gilje; D. Goeldi; T. Golan; M. Gonin; N. Grant; D. Gudin; D. R. Hadley; L. Haegel; A. Haesler; M. D. Haigh; P. Hamilton; D. Hansen; T. Hara; M. Hartz; T. Hasegawa; N. C. Hastings; T. Hayashino; Y. Hayato; C. Hearty; R. L. Helmer; M. Hierholzer; J. Hignight; A. Hillairet; A. Himmel; T. Hiraki; S. Hirota; J. Holeczek; S. Horikawa; F. Hosomi; K. Huang; A. K. Ichikawa; K. Ieki; M. Ieva; M. Ikeda; J. Imber; J. Insler; T. J. Irvine; T. Ishida; T. Ishii; E. Iwai; K. Iwamoto; K. Iyogi; A. Izmaylov; A. Jacob; B. Jamieson; M. Jiang; S. Johnson; J. H. Jo; P. Jonsson; C. K. Jung; M. Kabirnezhad; A. C. Kaboth; T. Kajita; H. Kakuno; J. Kameda; Y. Kanazawa; D. Karlen; I. Karpikov; T. Katori; E. Kearns; M. Khabibullin; A. Khotjantsev; D. Kielczewska; T. Kikawa; A. Kilinski; J. Kim; S. King; J. Kisiel; P. Kitching; T. Kobayashi; L. Koch; T. Koga; A. Kolaceke; A. Konaka; L. L. Kormos; A. Korzenev; Y. Koshio; W. Kropp; H. Kubo; Y. Kudenko; R. Kurjata; T. Kutter; J. Lagoda; I. Lamont; E. 
Larkin; M. Laveder; M. Lawe; M. Lazos; T. Lindner; C. Lister; R. P. Litchfield; A. Longhin; J. P. Lopez; L. Ludovici; L. Magaletti; K. Mahn; M. Malek; S. Manly; A. D. Marino; J. Marteau; J. F. Martin; P. Martins; S. Martynenko; T. Maruyama; V. Matveev; K. Mavrokoridis; E. Mazzucato; M. McCarthy; N. McCauley; K. S. McFarland; C. McGrew; A. Mefodiev; C. Metelko; M. Mezzetto; P. Mijakowski; C. A. Miller; A. Minamino; O. Mineev; A. Missert; M. Miura; S. Moriyama; Th. A. Mueller; A. Murakami; M. Murdoch; S. Murphy; J. Myslik; T. Nakadaira; M. Nakahata; K. G. Nakamura; K. Nakamura; S. Nakayama; T. Nakaya; K. Nakayoshi; C. Nantais; C. Nielsen; M. Nirkko; K. Nishikawa; Y. Nishimura; J. Nowak; H. M. O'Keeffe; R. Ohta; K. Okumura; T. Okusawa; W. Oryszczak; S. M. Oser; T. Ovsyannikova; R. A. Owen; Y. Oyama; V. Palladino; J. L. Palomino; V. Paolone; D. Payne; O. Perevozchikov; J. D. Perkin; Y. Petrov; L. Pickard; E. S. Pinzon Guerra; C. Pistillo; P. Plonski; E. Poplawska; B. Popov; M. Posiadala-Zezula; J. -M. Poutissou; R. Poutissou; P. Przewlocki; B. Quilain; E. Radicioni; P. N. Ratoff; M. Ravonel; M. A. M. Rayner; A. Redij; M. Reeves; E. Reinherz-Aronis; C. Riccio; P. A. Rodrigues; P. Rojas; E. Rondio; S. Roth; A. Rubbia; D. Ruterbories; A. Rychter; R. Sacco; K. Sakashita; F. Sánchez; F. Sato; E. Scantamburlo; K. Scholberg; S. Schoppmann; J. Schwehr; M. Scott; Y. Seiya; T. Sekiguchi; H. Sekiya; D. Sgalaberna; R. Shah; F. Shaker; D. Shaw; M. Shiozawa; S. Short; Y. Shustrov; P. Sinclair; B. Smith; M. Smy; J. T. Sobczyk; H. Sobel; M. Sorel; L. Southwell; P. Stamoulis; J. Steinmann; B. Still; Y. Suda; A. Suzuki; K. Suzuki; S. Y. Suzuki; Y. Suzuki; R. Tacik; M. Tada; S. Takahashi; A. Takeda; Y. Takeuchi; H. K. Tanaka; H. A. Tanaka; M. M. Tanaka; D. Terhorst; R. Terri; L. F. Thompson; A. Thorley; S. Tobayama; W. Toki; T. Tomura; Y. Totsuka; C. Touramanis; T. Tsukamoto; M. Tzanov; Y. Uchida; A. Vacheret; M. Vagins; G. Vasseur; T. Wachala; K. Wakamatsu; C. W. Walter; D. Wark; W. 
Warzycha; M. O. Wascko; A. Weber; R. Wendell; R. J. Wilkes; M. J. Wilking; C. Wilkinson; Z. Williamson; J. R. Wilson; R. J. Wilson; T. Wongjirad; Y. Yamada; K. Yamamoto; C. Yanagisawa; T. Yano; S. Yen; N. Yershov; M. Yokoyama; K. Yoshida; T. Yuan; M. Yu; A. Zalewska; J. Zalipska; L. Zambelli; K. Zaremba; M. Ziembicki; E. D. Zimmerman; M. Zito; J. Żmuda
2015-03-30
We report on measurements of neutrino oscillation using data from the T2K long-baseline neutrino experiment collected between 2010 and 2013. In an analysis of muon neutrino disappearance alone, we find the following estimates and 68% confidence intervals for the two possible mass hierarchies: Normal Hierarchy: $\sin^2\theta_{23}=0.514^{+0.055}_{-0.056}$ and $\Delta m^2_{32}=(2.51\pm0.10)\times 10^{-3}$ eV$^2$/c$^4$; Inverted Hierarchy: $\sin^2\theta_{23}=0.511\pm0.055$ and $\Delta m^2_{13}=(2.48\pm0.10)\times 10^{-3}$ eV$^2$/c$^4$. The analysis accounts for multi-nucleon mechanisms in neutrino interactions, which were found to introduce negligible bias. We describe our first analyses that combine measurements of muon neutrino disappearance and electron neutrino appearance to estimate four oscillation parameters and the mass hierarchy. Frequentist and Bayesian intervals are presented for combinations of these parameters, with and without including recent reactor measurements. At 90% confidence level and including reactor measurements, we exclude the region $\delta_{CP}=[0.15,0.83]\pi$ for normal hierarchy and $\delta_{CP}=[-0.08,1.09]\pi$ for inverted hierarchy. The T2K and reactor data weakly favor the normal hierarchy with a Bayes factor of 2.2. The most probable values and 68% 1D credible intervals for the other oscillation parameters, when reactor data are included, are: $\sin^2\theta_{23}=0.528^{+0.055}_{-0.038}$ and $|\Delta m^2_{32}|=(2.51\pm0.11)\times 10^{-3}$ eV$^2$/c$^4$.
Aspirin and Statin Nonuse Associated With Early Biochemical Failure After Prostate Radiation Therapy
Zaorsky, Nicholas G.; Buyyounouski, Mark K.; Li, Tianyu; Horwitz, Eric M.
2012-09-01
Purpose: To present the largest retrospective series investigating the effect of aspirin and statins, which are hypothesized to have antineoplastic properties, on biochemical failure (nadir plus 2 ng/mL) after prostate radiation therapy (RT). Methods and Materials: Between 1989 and 2006, 2051 men with clinically localized prostate cancer received definitive RT alone (median dose, 76 Gy). The rates of aspirin use and statin use (defined as any use at the time of RT or during follow-up) were 36% and 34%, respectively. The primary endpoint of the study was an interval to biochemical failure (IBF) of less than 18 months, which has been shown to be the single strongest predictor of distant metastasis, prostate cancer survival, and overall survival after RT. Patient demographic characteristics and tumor staging factors were assessed with regard to associations with the endpoint. Univariate analysis was performed with the χ² test for categorical variables and the Wilcoxon test for continuous variables. Multivariable analysis was performed with a multiple logistic regression. Results: The median follow-up was 75 months. Univariate analysis showed that an IBF of less than 18 months was associated with aspirin nonuse (P<.0001), statin nonuse (P<.0001), anticoagulant nonuse (P=.0006), cardiovascular disease (P=.0008), and prostate-specific antigen (continuous) (P=.008) but not with Gleason score, age, RT dose, or T stage. On multivariate analysis, only aspirin nonuse (P=.0012; odds ratio, 2.052 [95% confidence interval, 1.328-3.172]) and statin nonuse (P=.0002; odds ratio, 2.465 [95% confidence interval, 1.529-3.974]) were associated with an IBF of less than 18 months. Conclusions: In patients who received RT for prostate cancer, aspirin or statin nonuse was associated with early biochemical failure, a harbinger of distant metastasis and death.
Further study is needed to confirm these findings and to determine the optimal dosing and schedule, as well as the relative benefits and risks, of both therapies in combination with RT.
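The odds ratios with 95% confidence intervals quoted above come from multiple logistic regression; a Wald-type interval simply exponentiates the normal interval around the coefficient. The coefficient and standard error below are hypothetical values chosen only to roughly reproduce the aspirin result, not the study's fitted model:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Wald 95% CI for an odds ratio, from a logistic-regression
    coefficient and its standard error: exponentiate beta +/- z*se."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient/SE roughly reproducing the aspirin-nonuse
# result quoted above (OR ~2.05, 95% CI ~1.33-3.17)
or_, lo, hi = odds_ratio_ci(beta=0.719, se=0.222)
```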
Salvage/Adjuvant Brachytherapy After Ophthalmic Artery Chemosurgery for Intraocular Retinoblastoma
Francis, Jasmine H.; Barker, Christopher A.; Wolden, Suzanne L.; McCormick, Beryl; Segal, Kira; Cohen, Gil; Gobin, Y. Pierre; Marr, Brian P.; Brodie, Scott E.; Dunkel, Ira J.; Abramson, David H.
2013-11-01
Purpose: To evaluate the efficacy and toxicity of brachytherapy after ophthalmic artery chemosurgery (OAC) for retinoblastoma. Methods and Materials: This was a single-arm, retrospective study of 15 eyes in 15 patients treated with OAC followed by brachytherapy at (blinded institution) between May 1, 2006, and December 31, 2012, with a median 19 months' follow-up from plaque insertion. Outcome measurements included patient and ocular survival, visual function, and retinal toxicity measured by electroretinogram (ERG). Results: Brachytherapy was used as adjuvant treatment in 2 eyes and as salvage therapy in 13 eyes of which 12 had localized vitreous seeding. No patients developed metastasis or died of retinoblastoma. The Kaplan-Meier estimate of ocular survival was 79.4% (95% confidence interval 48.7%-92.8%) at 18 months. Three eyes were enucleated, and an additional 6 eyes developed out-of-target volume recurrences, which were controlled with additional treatments. Patients with an ocular complication had a mean interval between last OAC and plaque of 2.5 months (SD 2.3 months), which was statistically less (P=.045) than patients without ocular complication who had a mean interval between last OAC and plaque of 6.5 months (SD 4.4 months). ERG responses from pre- versus postplaque were unchanged or improved in more than half the eyes. Conclusions: Brachytherapy following OAC is effective, even in the presence of vitreous seeding; the majority of eyes maintained stable or improved retinal function following treatment, as assessed by ERG.
Pillow, Jonathan
. Acknowledgments 9. Conclusions · Conditional renewal (CR) process model incorporates real-time and rescaled dependencies between ISIs can also be modeled using conditional renewal densities 4. Time-rescaling theorem of conditional renewal model 8. Application to retinal data 7. Removing serial dependencies 2. Incorporating
Atalar, Ergin
of view, RF = radiofrequency, SPGR = spoiled gradient echo, TE = echo time, TR = repetition time, 3D cancer, laser or radio-frequency (RF) ablation of head and neck tumors, monitoring of prostate cancer agent needed, and no risk of ionizing radiation, MR-guided cardiovascular interventions are still
Bisi, M. M.; Jackson, B. V.; Buffington, A.; Clover, J. M.; Hick, P. P.; Tokumaru, M.
2009-01-01
structure of the fast solar wind. J. Geophys. Res. 112,observations of the solar wind. Proc. SPIE 6689, 668911-1.W.A. , Maagoe, S. : 1972, Solar wind velocity from ips
Lauwereyns, Jan
. We calculated statistics of wind velocities (vertical, longitudinal and lateral) inside the canopy at the study site), for at least 100 seeds per species. We incorporated temporal variation in wind conditions by running the model for all 1,271 half-hour averages of u* and wind direction recorded by the upper
Percentage of Positive Biopsy Cores: A Better Risk Stratification Model for Prostate Cancer?
Huang Jiayi; Vicini, Frank A. [Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, MI (United States); Williams, Scott G. [Peter Maccallum Cancer Centre and University of Melbourne, Melbourne, Victoria (Australia); Ye Hong; McGrath, Samuel; Ghilezan, Mihai; Krauss, Daniel; Martinez, Alvaro A. [Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, MI (United States); Kestin, Larry L., E-mail: lkestin@comcast.net [Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, MI (United States)
2012-07-15
Purpose: To assess the prognostic value of the percentage of positive biopsy cores (PPC) and perineural invasion (PNI) in predicting the clinical outcomes after radiotherapy (RT) for prostate cancer and to explore the possibilities to improve on existing risk-stratification models. Methods and Materials: Between 1993 and 2004, 1,056 patients with clinical Stage T1c-T3N0M0 prostate cancer, who had four or more biopsy cores sampled and complete biopsy core data available, were treated with external beam RT, with or without a high-dose-rate brachytherapy boost at William Beaumont Hospital. The median follow-up was 7.6 years. Multivariate Cox regression analysis was performed with PPC, Gleason score, pretreatment prostate-specific antigen, T stage, PNI, radiation dose, androgen deprivation, age, prostate-specific antigen frequency, and follow-up duration. A new risk stratification (PPC classification) was empirically devised to incorporate PPC and replace the T stage. Results: On multivariate Cox regression analysis, the PPC was an independent predictor of distant metastasis, cause-specific survival, and overall survival (all p < .05). A PPC >50% was associated with significantly greater distant metastasis (hazard ratio, 4.01; 95% confidence interval, 1.86-8.61), and its independent predictive value remained significant with or without androgen deprivation therapy (all p < .05). In contrast, PNI and T stage were only predictive for locoregional recurrence. Combining the PPC (≤50% vs. >50%) with National Comprehensive Cancer Network risk stratification demonstrated added prognostic value of distant metastasis for the intermediate-risk (hazard ratio, 5.44; 95% confidence interval, 1.78-16.6) and high-risk (hazard ratio, 4.39; 95% confidence interval, 1.70-11.3) groups, regardless of the use of androgen deprivation and high-dose RT (all p < .05).
The proposed PPC classification appears to provide improved stratification of the clinical outcomes relative to the National Comprehensive Cancer Network classification. Conclusions: The PPC is an independent and powerful predictor of clinical outcomes of prostate cancer after RT. A risk model replacing T stage with the PPC to reduce subjectivity demonstrated potentially improved stratification.
Xu, Xiaofeng; Thornton, Peter E; Post, Wilfred M
2013-01-01
Soil microbes play a pivotal role in regulating land-atmosphere interactions; the soil microbial biomass carbon (C), nitrogen (N), and phosphorus (P) contents and their C:N:P stoichiometry are important regulators of soil biogeochemical processes. However, current knowledge of the magnitude, stoichiometry, storage, and spatial distribution of global soil microbial biomass C, N, and P is limited. In this study, 3087 pairs of data points were retrieved from 281 published papers and used to summarize the magnitudes and stoichiometries of C, N, and P in soils and soil microbial biomass at global and biome levels. Finally, the global stocks and spatial distributions of microbial biomass C and N in the 0-30 cm and 0-100 cm soil profiles were estimated. The results show that C, N, and P in soils and soil microbial biomass vary substantially across biomes; the fractions of soil C, N, and P held in soil microbial biomass are 1.6% (95% confidence interval 1.5%-1.6%), 2.9% (95% confidence interval 2.8%-3.0%), and 4.4% (95% confidence interval 3.9%-5.0%), respectively. The best estimates of the C:N:P stoichiometries of soil nutrients and soil microbial biomass at the global scale are 153:11:1 and 47:6:1, respectively, and they vary over a wide range among biomes. The vertical distribution of soil microbial biomass follows the distribution of roots down to 1 m depth. The global stocks of soil microbial biomass C and N were estimated to be 15.2 Pg C and 2.3 Pg N in the 0-30 cm soil profiles, and 21.2 Pg C and 3.2 Pg N in the 0-100 cm soil profiles. We did not estimate P in soil microbial biomass because of data shortage and its insignificant correlation with soil total P and climate variables. The spatial patterns of soil microbial biomass C and N were consistent with those of soil organic C and total N, i.e., high density at northern high latitudes and low density at low latitudes and in the southern hemisphere.
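The biome-pooled fractions above are reported with 95% confidence intervals; a minimal percentile-bootstrap sketch of such an interval, using a small hypothetical sample of microbial-C fractions rather than the study's 3087 data pairs, is:

```python
import random
import statistics

def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        statistics.fmean(rng.choices(data, k=n)) for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical microbial-C fractions (% of soil C), not the study's data
fractions = [1.4, 1.7, 1.5, 1.6, 1.8, 1.5, 1.6, 1.4, 1.7, 1.6]
lo, hi = bootstrap_ci(fractions)
print(f"mean = {statistics.fmean(fractions):.2f}%, 95% CI = ({lo:.2f}%, {hi:.2f}%)")
```

The percentile bootstrap makes no normality assumption, which suits the skewed ratios typical of pooled literature data.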
Thomas C. Chidsey, Jr.
2002-11-01
The Paradox Basin of Utah, Colorado, and Arizona contains nearly 100 small oil fields producing from shallow-shelf carbonate buildups or mounds within the Desert Creek zone of the Pennsylvanian (Desmoinesian) Paradox Formation. These fields typically have one to four wells with primary production ranging from 700,000 to 2,000,000 barrels (111,300-318,000 m{sup 3}) of oil per field at a 15 to 20 percent recovery rate. Five fields in southeastern Utah were evaluated for waterflood or carbon-dioxide (CO{sub 2})-miscible flood projects based upon geological characterization and reservoir modeling. Geological characterization on a local scale focused on reservoir heterogeneity, quality, and lateral continuity as well as possible compartmentalization within each of the five project fields. The Desert Creek zone includes three generalized facies belts: (1) open-marine, (2) shallow-shelf and shelf-margin, and (3) intra-shelf, salinity-restricted facies. These deposits have modern analogs near the coasts of the Bahamas, Florida, and Australia, respectively, and outcrop analogs along the San Juan River of southeastern Utah. The analogs display reservoir heterogeneity, flow barriers and baffles, and lithofacies geometry observed in the fields; thus, these properties were incorporated in the reservoir simulation models. Productive carbonate buildups consist of three types: (1) phylloid algal, (2) coralline algal, and (3) bryozoan. Phylloid-algal buildups have a mound-core interval and a supra-mound interval. Hydrocarbons are stratigraphically trapped in porous and permeable lithotypes within the mound-core intervals of the lower part of the buildups and the more heterogeneous supramound intervals. 
To adequately represent the observed spatial heterogeneities in reservoir properties, the phylloid-algal bafflestones of the mound-core interval and the dolomites of the overlying supra-mound interval were subdivided into ten architecturally distinct lithotypes, each of which exhibits a characteristic set of reservoir properties obtained from outcrop analogs, cores, and geophysical logs. The Anasazi and Runway fields were selected for geostatistical modeling and reservoir compositional simulations. Models and simulations incorporated variations in carbonate lithotypes, porosity, and permeability to accurately predict reservoir responses. History matches tied previous production and reservoir pressure histories so that future reservoir performances could be confidently predicted. The simulation studies showed that despite most of the production being from the mound-core intervals, there were no corresponding decreases in the oil in place in these intervals. This behavior indicates gravity drainage of oil from the supra-mound intervals into the lower mound-core intervals from which the producing wells' major share of production arises. The key to increasing ultimate recovery from these fields (and similar fields in the basin) is to design either waterflood or CO{sub 2}-miscible flood projects capable of forcing oil from high-storage-capacity but low-recovery supra-mound units into the high-recovery mound-core units. Simulation of Anasazi field shows that a CO{sub 2} flood is technically superior to a waterflood and economically feasible. For Anasazi field, an optimized CO{sub 2} flood is predicted to recover a total 4.21 million barrels (0.67 million m3) of oil representing in excess of 89 percent of the original oil in place. For Runway field, the best CO{sub 2} flood is predicted to recover a total of 2.4 million barrels (0.38 million m3) of oil representing 71 percent of the original oil in place. 
If the CO{sub 2} floods perform as predicted, they represent a financially robust process for increasing reserves in the many small fields of the Paradox Basin. The results can be applied to other fields in the Rocky Mountain region, the Michigan and Illinois Basins, and the Midcontinent.
The production and certification of a plutonium equal-atom reference material: NBL CRM 128
Crawford, D.W. (Department of Energy, Washington, DC (USA). Office of Safeguards and Security); Gradle, C.G.; Soriano, M.D. (USDOE New Brunswick Lab., Argonne, IL (USA))
1990-07-01
This report describes the design, production, and certification of the New Brunswick Laboratory plutonium equal-atom certified reference material (CRM), NBL CRM 128. The primary use of this CRM is for the determination of bias corrections encountered in the operation of a mass spectrometer. This reference material is available to the US Department of Energy contractor-operated and government-operated laboratories, as well as to the international nuclear safeguards community. The absolute, or unbiased, certified value for the CRM's Pu-242/Pu-239 ratio is 1.00063 {plus minus} 0.00026 (95% confidence interval) as of October 1, 1984. This value was obtained through the quantitative blending of high-purity, chemically and isotopically characterized separated isotopes, as well as through intercomparisons of CRM samples with calibration mixtures using thermal ionization mass spectrometry. 32 tabs.
Medium term municipal solid waste generation prediction by autoregressive integrated moving average
Younes, Mohammad K.; Nopiah, Z. M.; Basri, Noor Ezlin A.; Basri, Hassan
2014-09-12
Generally, solid waste handling and management are performed by a municipality or local authority. In most developing countries, local authorities suffer from serious solid waste management (SWM) problems, insufficient data, and a lack of strategic planning. It is therefore important to develop a robust solid waste generation forecasting model, which helps to properly manage the generated solid waste and to develop future plans based on relatively accurate figures. In Malaysia, the solid waste generation rate is increasing rapidly due to population growth and the new consumption trends that characterize the modern lifestyle. This paper aims to develop a monthly solid waste forecasting model using the Autoregressive Integrated Moving Average (ARIMA) method; such a model is applicable even when data are scarce and will help the municipality properly establish its annual service plan. The results show that an ARIMA (6,1,0) model predicts monthly municipal solid waste generation with a root mean square error of 0.0952, and the model forecast residuals are within the accepted 95% confidence interval.
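The residual check described above (forecast residuals falling within an accepted 95% confidence interval) can be sketched as follows, assuming approximately normal, zero-mean forecast errors so that the interval is ±1.96 × RMSE; the residual values are hypothetical, not the paper's:

```python
import math

def rmse(residuals):
    """Root mean square error of a list of forecast residuals."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

def within_95(residuals):
    """Check residuals against an approximate 95% interval, assuming
    zero-mean, roughly normal errors: |r| <= 1.96 * RMSE."""
    bound = 1.96 * rmse(residuals)
    return all(abs(r) <= bound for r in residuals), bound

# Hypothetical monthly forecast residuals (scaled units)
residuals = [0.05, -0.08, 0.10, -0.04, 0.07, -0.12, 0.03, -0.06]
ok, bound = within_95(residuals)
print(f"RMSE = {rmse(residuals):.4f}, 95% bound = ±{bound:.4f}, all within: {ok}")
```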
Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance
Ahn, Tae-Hyuk; Crosskey, JJ; Pan, Chongle
2015-01-01
Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm's performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source code and binaries freely available at http://sigma.omicsbio.org.
Statistical damage identification techniques applied to the I-40 bridge over the Rio Grande River
Doebling, S.W.; Farrar, C.R.
1998-03-01
The statistical significance of vibration-based damage identification parameters is studied via application to the data from the tests performed on the Interstate 40 highway bridge in Albuquerque, New Mexico. A test of statistical significance is applied to the mean and confidence interval estimates of the modal properties and the corresponding damage indicators. The damage indicator used in this study is the change in the measured flexibility matrix. Previously presented deterministic results indicate that damage is detectable in all of the damage cases from these data sets. The results of this study indicate that the changes in both the modal properties and the damage indicators are statistically significant for all of the damage cases. However, these changes are distributed spatially for the first three damage cases and do not localize the damage until the fourth and final damage case.
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e., an event mode sequence (EMS), using a statistical approach that combines Bayesian inference with physics-model-based signal processing, in which a radionuclide is represented as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval, condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS either is identified as the target radionuclide or is not; if not, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
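The sequential likelihood ratio test in the final step is, in essence, Wald's SPRT with two stopping thresholds; a minimal sketch for Poisson count observations (hypothetical rates, not the patented processing chain) is:

```python
import math

def sprt(observations, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test for Poisson counts:
    H0 rate lam0 (background only) vs H1 rate lam1 (target radionuclide).
    alpha = false-alarm probability, beta = miss probability."""
    upper = math.log((1 - beta) / alpha)   # cross -> accept H1 (target)
    lower = math.log(beta / (1 - alpha))   # cross -> accept H0 (not target)
    llr = 0.0
    for i, k in enumerate(observations, 1):
        # log of Poisson(k; lam1) / Poisson(k; lam0); the k! terms cancel
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "target", i
        if llr <= lower:
            return "not target", i
    return "undecided", len(observations)

# Hypothetical counts per time bin, background rate 1.0 vs target rate 4.0
decision, n = sprt([4, 5, 3, 6, 5, 4], lam0=1.0, lam1=4.0)
print(decision, n)  # decides "target" after 2 observations
```

The appeal of the sequential form is exactly what the abstract describes: the test stops as soon as the evidence crosses either threshold, rather than after a fixed number of photon events.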
Tank 241-AX-103, cores 212 and 214 analytical results for the final report
Steen, F.H.
1998-02-05
This document is the analytical laboratory report for tank 241-AX-103 push mode core segments collected between July 30, 1997 and August 11, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-AX-103 Push Mode Core Sampling and Analysis Plan (TSAP) (Conner, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), plutonium-239 (Pu-239), and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Conner, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.
Digital Elevation Model, 0.25 m, Barrow Environmental Observatory, Alaska, 2013
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Cathy Wilson; Garrett Altmann
2015-11-20
This 0.25 m horizontal resolution digital elevation model (DEM) was developed from airborne laser altimetry flown by Aerometric, Inc. (now Quantum Spatial, Inc.) on 12 July 2013. One mission was flown, and the data were jointly processed with LANL personnel to produce a 0.25 m DEM covering a region approximately 2.8 km wide and 12.4 km long, extending from the coast above North Salt Lagoon to south of Gas Well Road. This DEM encompasses a diverse range of hydrologic, geomorphic, geophysical, and biological features typical of the Barrow Peninsula. Vertical accuracy at the 95% confidence interval was computed as 0.143 m. The coordinate system, datum, and geoid for this DEM are UTM Zone 4N, NAD83 (2011), and NAVD88 (GEOID09).
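A 95% vertical accuracy figure like the 0.143 m above is conventionally computed (e.g., per the NSSDA guideline) as 1.96 times the RMSE of elevation errors at independent checkpoints; whether this dataset used exactly that formula is an assumption. A sketch with hypothetical lidar-vs-survey checkpoint errors:

```python
import math

def nssda_vertical_accuracy(errors):
    """NSSDA-style vertical accuracy at 95% confidence:
    1.96 * RMSE of checkpoint errors (assumes normal, zero-bias errors)."""
    rmse_z = math.sqrt(sum(e * e for e in errors) / len(errors))
    return 1.96 * rmse_z

# Hypothetical DEM-minus-GPS checkpoint errors (metres)
errors = [0.05, -0.08, 0.06, -0.09, 0.07, -0.06, 0.08, -0.07]
print(f"vertical accuracy (95%) = {nssda_vertical_accuracy(errors):.3f} m")
```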
Search for Dijet Resonances in 7 TeV pp Collisions at CMS
Khachatryan, V.; et al.
2010-11-01
A search for narrow resonances in the dijet mass spectrum is performed using data corresponding to an integrated luminosity of 2.9 inverse pb collected by the CMS experiment at the LHC. Upper limits at the 95% confidence level (CL) are presented on the product of the resonance cross section, branching fraction into dijets, and acceptance, separately for decays into quark-quark, quark-gluon, or gluon-gluon pairs. The data exclude new particles predicted in the following models at the 95% CL: string resonances, with mass less than 2.50 TeV, excited quarks, with mass less than 1.58 TeV, and axigluons, colorons, and E_6 diquarks, in specific mass intervals. This extends previously published limits on these models.
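A 95% CL upper limit of the kind quoted above reduces, for a simple counting experiment with negligible background, to the classical Poisson construction: the smallest signal mean mu for which observing at most n events has probability 1 − CL. A minimal sketch (illustrative only; the CMS analysis uses a far more detailed statistical model over the dijet mass spectrum):

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, cl=0.95):
    """Classical Poisson upper limit on a signal mean with no background:
    smallest mu with P(N <= n_obs; mu) = 1 - cl, found by bisection."""
    lo, hi = 0.0, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid) > 1 - cl:
            lo = mid   # mu still too small; cdf above the target
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With 0 observed events the 95% CL limit is ln(20), about 3.0 events
print(f"{upper_limit(0):.2f}")
```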
High Speed Peltier Calorimeter for the Calibration of High Bandwidth Power Measurement Equipment
Frost, Damien F
2015-01-01
Accurate power measurements of electronic components operating at high frequencies are vital in determining where power losses occur in a system such as a power converter. Such power measurements must be carried out with equipment that can accurately measure real power at high frequency. We present the design of a high speed calorimeter to address this requirement, capable of reaching a steady state in less than 10 minutes. The system uses Peltier thermoelectric coolers to remove heat generated in a load resistance, and was calibrated against known real power measurements using an artificial neural network. A dead zone controller was used to achieve stable power measurements. The calibration was validated and shown to have an absolute accuracy of +/-8 mW (95% confidence interval) for measurements of real power from 0.1 to 5 W.
On the estimation of the extremal index based on scaling and resampling
Hamidieh, Kamal; Michailidis, George
2010-01-01
The extremal index parameter theta characterizes the degree of local dependence in the extremes of a stationary time series and has important applications in a number of areas, such as hydrology, telecommunications, finance and environmental studies. In this study, a novel estimator for theta based on the asymptotic scaling of block-maxima and resampling is introduced. It is shown to be consistent and asymptotically normal for a large class of m-dependent time series. Further, a procedure for the automatic selection of its tuning parameter is developed, and different types of confidence intervals that prove useful in practice are proposed. The performance of the estimator is examined through simulations, which show its highly competitive behavior. Finally, the estimator is applied to three real data sets: daily crude oil prices, daily returns of the S&P 500 stock index, and high-frequency, intra-day traded volumes of a stock. These applications demonstrate additional diagnostic features of statistical plots ...
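For intuition about what theta measures, the classical runs estimator (number of exceedance clusters divided by total exceedances over a high threshold) is easy to sketch; note this is not the paper's block-maxima scaling and resampling estimator:

```python
def runs_extremal_index(x, u, r=1):
    """Classical runs estimator of the extremal index theta:
    exceedances of threshold u separated by more than r non-exceedances
    start a new cluster; theta_hat = #clusters / #exceedances.
    (For intuition only; the paper's estimator is different.)"""
    exceed = [i for i, v in enumerate(x) if v > u]
    if not exceed:
        return None
    clusters = 1 + sum(1 for a, b in zip(exceed, exceed[1:]) if b - a > r)
    return clusters / len(exceed)

# Toy series: 3 clusters of exceedances of u=2.0, 6 exceedances total
series = [0.1, 2.3, 2.1, 0.2, 0.3, 2.5, 0.1, 0.4, 2.2, 2.4, 2.6, 0.2]
print(runs_extremal_index(series, u=2.0, r=1))  # 3 clusters / 6 -> 0.5
```

Theta near 1 indicates near-independent extremes; theta well below 1, as here, indicates extremes arriving in clusters.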
Search for the decay modes D⁰→e⁺e⁻, D⁰→μ⁺μ⁻, and D⁰→e^{±}μ^{∓}
Lees, J. P.; Poireau, V.; Tisserand, V.; Garra Tico, J.; Grauges, E.; Palano, A.; Eigen, G.; Stugu, B.; Brown, D. N.; Kerth, L. T.; Kolomensky, Yu. G.; Lynch, G.; Koch, H.; Schroeder, T.; Asgeirsson, D. J.; Hearty, C.; Mattison, T. S.; McKenna, J. A.; So, R. Y.; Khan, A.; Blinov, V. E.; Buzykaev, A. R.; Druzhinin, V. P.; Golubev, V. B.; Kravchenko, E. A.; Onuchin, A. P.; Serednyakov, S. I.; Skovpen, Yu. I.; Solodov, E. P.; Todyshev, K. Yu.; Yushkov, A. N.; Bondioli, M.; Kirkby, D.; Lankford, A. J.; Mandelkern, M.; Atmacan, H.; Gary, J. W.; Liu, F.; Long, O.; Mullin, E.; Vitug, G. M.; Campagnari, C.; Hong, T. M.; Kovalskyi, D.; Richman, J. D.; West, C. A.; Eisner, A. M.; Kroseberg, J.; Lockman, W. S.; Martinez, A. J.; Schumm, B. A.; Seiden, A.; Chao, D. S.; Cheng, C. H.; Echenard, B.; Flood, K. T.; Hitlin, D. G.; Ongmongkolkul, P.; Porter, F. C.; Rakitin, A. Y.; Andreassen, R.; Huard, Z.; Meadows, B. T.; Sokoloff, M. D.; Sun, L.; Bloom, P. C.; Ford, W. T.; Gaz, A.; Nauenberg, U.; Smith, J. G.; Wagner, S. R.; Ayad, R.; Toki, W. H.; Spaan, B.; Schubert, K. R.; Schwierz, R.; Bernard, D.; Verderi, M.; Clark, P. J.; Playfer, S.; Bettoni, D.; Bozzi, C.; Calabrese, R.; Cibinetto, G.; Fioravanti, E.; Garzia, I.; Luppi, E.; Munerato, M.; Piemontese, L.; Santoro, V.; Baldini-Ferroli, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Patteri, P.; Peruzzi, I. M.; Piccolo, M.; Rama, M.; Zallo, A.; Contri, R.; Guido, E.; Lo Vetere, M.; Monge, M. R.; Passaggio, S.; Patrignani, C.; Robutti, E.; Bhuyan, B.; Prasad, V.; Lee, C. L.; Morii, M.; Edwards, A. J.; Adametz, A.; Uwer, U.; Lacker, H. M.; Lueck, T.; Dauncey, P. D.; Mallik, U.; Chen, C.; Cochran, J.; Meyer, W. T.; Prell, S.; Rubin, A. E.; Gritsan, A. V.; Guo, Z. J.; Arnaud, N.; Davier, M.; Derkach, D.; Grosdidier, G.; Le Diberder, F.; Lutz, A. M.; Malaescu, B.; Roudeau, P.; Schune, M. H.; Stocchi, A.; Wormser, G.; Lange, D. J.; Wright, D. M.; Chavez, C. A.; Coleman, J. P.; Fry, J. R.; Gabathuler, E.; Hutchcroft, D. 
E.; Payne, D. J.; Touramanis, C.; Bevan, A. J.; Di Lodovico, F.; Sacco, R.; Sigamani, M.; Cowan, G.; Brown, D. N.; Davis, C. L.; Denig, A. G.; Fritsch, M.; Gradl, W.; Griessinger, K.; Hafner, A.; Prencipe, E.; Barlow, R. J.; Jackson, G.; Lafferty, G. D.; Behn, E.; Cenci, R.; Hamilton, B.; Jawahery, A.; Roberts, D. A.; Dallapiccola, C.; Cowan, R.; Dujmic, D.; Sciolla, G.; Cheaib, R.; Lindemann, D.; Patel, P. M.; Robertson, S. H.; Biassoni, P.; Neri, N.; Palombo, F.; Stracka, S.; Cremaldi, L.; Godang, R.; Kroeger, R.; Sonnek, P.; Summers, D. J.; Nguyen, X.; Simard, M.; Taras, P.; De Nardo, G.; Monorchio, D.; Onorato, G.; Sciacca, C.; Martinelli, M.; Raven, G.; Jessop, C. P.; LoSecco, J. M.; Wang, W. F.; Honscheid, K.; Kass, R.; Brau, J.; Frey, R.; Sinev, N. B.; Strom, D.; Torrence, E.; Feltresi, E.; Gagliardi, N.; Margoni, M.; Morandin, M.; Posocco, M.; Rotondo, M.; Simi, G.; Simonetto, F.; Stroili, R.; Akar, S.; Ben-Haim, E.; Bomben, M.; Bonneaud, G. R.; Briand, H.; Calderini, G.; Chauveau, J.; Hamon, O.; Leruste, Ph.; Marchiori, G.; Ocariz, J.; Sitt, S.; Biasini, M.; Manoni, E.; Pacetti, S.; Rossi, A.; Angelini, C.; Batignani, G.; Bettarini, S.; Carpinelli, M.; Casarosa, G.; Cervelli, A.; Forti, F.; Giorgi, M. A.; Lusiani, A.; Oberhof, B.; Paoloni, E.; Perez, A.; Rizzo, G.; Walsh, J. J.; Lopes Pegna, D.; Olsen, J.; Smith, A. J. S.; Telnov, A. V.; Anulli, F.; Faccini, R.; Ferrarotto, F.; Ferroni, F.; Gaspero, M.; Li Gioi, L.; Mazzoni, M. A.; Piredda, G.; Bünger, C.; Grünberg, O.; Hartmann, T.; Leddig, T.; Schröder, H.; Voss, C.; Waldi, R.; Adye, T.; Olaiya, E. O.; Wilson, F. F.; Emery, S.; Hamel de Monchenault, G.; Vasseur, G.; Yèche, Ch.; Aston, D.; Bard, D. J.; Bartoldus, R.; Benitez, J. F.; Cartaro, C.; Convery, M. R.; Dorfan, J.; Dubois-Felsmann, G. P.; Dunwoodie, W.; Ebert, M.; Field, R. C.; Franco Sevilla, M.; Fulsom, B. G.; Gabareen, A. M.; Graham, M. T.; Grenier, P.; Hast, C.; Innes, W. R.; Kelsey, M. H.; Kim, P.; Kocian, M. L.; Leith, D. W. G. 
S.; Lewis, P.; Lindquist, B.; Luitz, S.; Luth, V.; Lynch, H. L.; MacFarlane, D. B.; Muller, D. R.; Neal, H.; Nelson, S.; Perl, M.; Pulliam, T.; Ratcliff, B. N.; Roodman, A.; Salnikov, A. A.; Schindler, R. H.; Snyder, A.; Su, D.; Sullivan, M. K.; Va’vra, J.; Wagner, A. P.; Wisniewski, W. J.; Wittgen, M.; Wright, D. H.; Wulsin, H. W.; Young, C. C.; Ziegler, V.; Park, W.; Purohit, M. V.; White, R. M.; Wilson, J. R.; Randle-Conde, A.; Sekula, S. J.; Bellis, M.; Burchat, P. R.; Miyashita, T. S.; Puccio, E. M. T.; Alam, M. S.; Ernst, J. A.; Gorodeisky, R.; Guttman, N.; Peimer, D. R.; Soffer, A.; Lund, P.; Spanier, S. M.; Ritchie, J. L.; Ruland, A. M.; Schwitters, R. F.; Wray, B. C.; Izen, J. M.; Lou, X. C.; Bianchi, F.; Gamba, D.; Zambito, S.; Lanceri, L.; Vitale, L.; Martinez-Vidal, F.
2012-08-01
We present searches for the rare decay modes D⁰→e⁺e⁻, D⁰→μ⁺μ⁻, and D⁰→e^{±}μ^{∓} in continuum e⁺e⁻→cc̄ events recorded by the BABAR detector in a data sample that corresponds to an integrated luminosity of 468 fb⁻¹. These decays are highly Glashow–Iliopoulos–Maiani suppressed but may be enhanced in several extensions of the standard model. Our observed event yields are consistent with the expected backgrounds. An excess is seen in the D⁰→μ⁺μ⁻ channel, although the observed yield is consistent with an upward background fluctuation at the 5% level. Using the Feldman–Cousins method, we set the following 90% confidence level intervals on the branching fractions: B(D⁰→e⁺e⁻) < 1.7×10⁻⁷, B(D⁰→μ⁺μ⁻) within [0.6, 8.1]×10⁻⁷, and B(D⁰→e^{±}μ^{∓}) < 3.3×10⁻⁷.
Huddart, Robert A.; Hall, Emma; Hussain, Syed A.; Jenkins, Peter; Rawlings, Christine; Tremlett, Jean; Crundwell, Malcolm; Adab, Fawzi A.; Sheehan, Denise; Syndikus, Isabel; Hendron, Carey; Lewis, Rebecca; Waters, Rachel; James, Nicholas D.
2013-10-01
Purpose: To test whether reducing radiation dose to uninvolved bladder while maintaining dose to the tumor would reduce side effects without impairing local control in the treatment of muscle-invasive bladder cancer. Methods and Materials: In this phase III multicenter trial, 219 patients were randomized to standard whole-bladder radiation therapy (sRT) or reduced high-dose volume radiation therapy (RHDVRT) that aimed to deliver full radiation dose to the tumor and 80% of maximum dose to the uninvolved bladder. Participants were also randomly assigned to receive radiation therapy alone or radiation therapy plus chemotherapy in a partial 2 × 2 factorial design. The primary endpoints for the radiation therapy volume comparison were late toxicity and time to locoregional recurrence (with a noninferiority margin of 10% at 2 years). Results: Overall incidence of late toxicity was less than predicted, with a cumulative 2-year Radiation Therapy Oncology Group grade 3/4 toxicity rate of 13% (95% confidence interval 8%, 20%) and no statistically significant differences between groups. The difference in 2-year locoregional recurrence free rate (RHDVRT − sRT) was 6.4% (95% confidence interval −7.3%, 16.8%) under an intention to treat analysis and 2.6% (−12.8%, 14.6%) in the “per-protocol” population. Conclusions: In this study RHDVRT did not result in a statistically significant reduction in late side effects compared with sRT, and noninferiority of locoregional control could not be concluded formally. However, overall low rates of clinically significant toxicity combined with low rates of invasive bladder cancer relapse confirm that (chemo)radiation therapy is a valid option for the treatment of muscle-invasive bladder cancer.
Feng, Felix Y.; Blas, Kevin; Olson, Karin; Stenmark, Matthew; Sandler, Howard; Hamstra, Daniel A.
2013-05-01
Purpose: To evaluate the role of androgen deprivation therapy (ADT) and its duration for high-risk prostate cancer patients treated with dose-escalated radiation therapy (RT). Methods and Materials: A retrospective analysis of high-risk prostate cancer patients treated with dose-escalated RT (minimum 75 Gy) with or without ADT was performed. The relationship of ADT use and duration with biochemical failure (BF), metastatic failure (MF), prostate cancer-specific mortality (PCSM), non-prostate cancer death (NPCD), and overall survival (OS) was assessed as a function of pretreatment characteristics, comorbid medical illness, and treatment using Fine and Gray's cumulative incidence methodology. Results: The median follow-up time was 64 months. In men with National Comprehensive Cancer Network defined high-risk prostate cancer treated with dose-escalated RT, on univariate analysis, both metastasis (P<.0001; hazard ratio 0.34; 95% confidence interval 0.18-0.67; cumulative incidence at 60 months 13% vs 35%) and PCSM (P=.015; hazard ratio 0.41; 95% confidence interval 0.2-1.0; cumulative incidence at 60 months 6% vs 11%) were improved with the use of ADT. On multivariate analysis for all high-risk patients, Gleason score was the strongest negative prognostic factor, and long-term ADT (LTAD) improved MF (P=.002), PCSM (P=.034), and OS (P=.001). In men with prostate cancer and Gleason scores 8 to 10, on multivariate analysis after adjustment for other risk features, there was a duration-dependent improvement in BF, metastasis, PCSM, and OS, all favoring LTAD in comparison with short-term ADT (STAD) or RT alone. Conclusion: For men with high-risk prostate cancer treated with dose-escalated EBRT, this retrospective study suggests that the combination of LTAD and RT provided a significant improvement in clinical outcome, which was especially true for those with Gleason scores of 8 to 10.
Vulto, Johanna C.M. [Comprehensive Cancer Centre South (IKZ), Eindhoven Cancer Registry, Eindhoven (Netherlands)], E-mail: ansvulto@home.nl; Lybeert, Marnix L.M. [Department of Radiotherapy, Catharina Hospital, Eindhoven (Netherlands); Louwman, Marieke W.J. [Comprehensive Cancer Centre South (IKZ), Eindhoven Cancer Registry, Eindhoven (Netherlands); Poortmans, Philip M.P. [Dr. Bernard Verbeeten Institute, Tilburg (Netherlands); Coebergh, Jan Willem W. [Comprehensive Cancer Centre South (IKZ), Eindhoven Cancer Registry, Eindhoven (Netherlands); Department of Public Health, Erasmus University Medical Centre, Rotterdam (Netherlands)
2009-06-01
Purpose: To explore current variations in the use of primary radiotherapy (RT) in a region with two RT departments with adjacent referral areas, one in the eastern and one in the western sector of the southern region of the Netherlands. Methods and Materials: We calculated the proportion of 147,588 patients with newly diagnosed cancer between 1988 and 2006 in the southern Netherlands who received primary RT. Especially for breast and rectal cancer patients we studied primary RT use according to stage (breast cancer) and age and separately for the eastern and western sectors. Results: The number of patients with new diagnoses receiving primary RT increased from 1,668 patients in 1988 to 2,971 patients in 2006, with the proportion of the overall patients receiving RT remaining more or less unchanged ({+-}30%). However, only 20% of elderly patients (75+ years) received primary RT. Over time, more patients with prostate and rectal cancer, fewer patients with lung and bladder cancer or Hodgkin's lymphoma, and, recently, more patients with cervical or endometrial cancer received RT. The proportion of patients with most other tumor types treated with RT remained more or less unchanged. The total RT rate was slightly higher for patients in the eastern sector. Of particular note, patients with breast or rectal cancer in the eastern sector were significantly more likely to receive primary RT than were their counterparts in the western sector (odds ratio = 1.4, 95% confidence interval =1.4-1.5, and odds ratio = 1.4, 95% confidence interval = 1.3-1.6, respectively). Conclusions: Although the number of RT-treated patients increased substantially during 1988 to 2006, the proportion remained essentially unchanged. In addition, large variations were found in referral rates for RT, especially in later years, between the eastern and the western sectors of the region.
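An odds ratio with a 95% confidence interval, as quoted above for the east/west referral comparison, is commonly obtained from a 2 × 2 table via the Woolf (log) method; the counts below are hypothetical, not the registry's:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Woolf (log) 95% CI for the odds ratio of a 2x2 table:
    rows = group (e.g., east/west sector), cols = outcome (RT yes/no)."""
    or_hat = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_hat) - z * se)
    hi = math.exp(math.log(or_hat) + z * se)
    return or_hat, lo, hi

# Hypothetical counts: east sector 700 RT / 300 no RT, west 550 / 450
or_hat, lo, hi = odds_ratio_ci(700, 300, 550, 450)
print(f"OR = {or_hat:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

A confidence interval excluding 1.0, as in the registry's breast and rectal cancer comparisons, indicates a statistically significant difference in referral odds.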
Proton Radiotherapy for Parameningeal Rhabdomyosarcoma: Clinical Outcomes and Late Effects
Childs, Stephanie K. [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Kozak, Kevin R. [Department of Radiation Oncology, University of Wisconsin Cancer Center Johnson Creek, Madison, WI (United States); Friedmann, Alison M. [Department of Pediatric Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Yeap, Beow Y. [Department of Medicine, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Adams, Judith; MacDonald, Shannon M.; Liebsch, Norbert J.; Tarbell, Nancy J. [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Yock, Torunn I., E-mail: tyock@partners.org [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States)
2012-02-01
Purpose: To report the clinical outcome and late side effect profile of proton radiotherapy in the treatment of children with parameningeal rhabdomyosarcoma (PM-RMS). Methods and Materials: Seventeen consecutive children with PM-RMS were treated with proton radiotherapy at Massachusetts General Hospital between 1996 and 2005. We reviewed the medical records of all patients and asked referring physicians to report specific side effects of interest. Results: Median patient age at diagnosis was 3.4 years (range, 0.4-17.6). Embryonal (n = 11), alveolar (n = 4), and undifferentiated (n = 2) histologies were represented. Ten patients (59%) had intracranial extension. Median prescribed dose was 50.4 cobalt gray equivalents (GyRBE) (range, 50.4-56.0 GyRBE) delivered in 1.8-2.0-GyRBE daily fractions. Median follow-up was 5.0 years for survivors. The 5-year failure-free survival estimate was 59% (95% confidence interval, 33-79%), and overall survival estimate was 64% (95% confidence interval, 37-82%). Among the 7 patients who failed, sites of first recurrence were local only (n = 2), regional only (n = 2), distant only (n = 2), and local and distant (n = 1). Late effects related to proton radiotherapy in the 10 recurrence-free patients (median follow-up, 5 years) include failure to maintain height velocity (n = 3), endocrinopathies (n = 2), mild facial hypoplasia (n = 7), failure of permanent tooth eruption (n = 3), dental caries (n = 5), and chronic nasal/sinus congestion (n = 2). Conclusions: Proton radiotherapy for patients with PM-RMS yields tumor control and survival comparable to that in historical controls with similar poor prognostic factors. Furthermore, rates of late effects from proton radiotherapy compare favorably to published reports of photon-treated cohorts.
Lamas, Maria J.
2012-01-01
Purpose: 5-Fluorouracil-based chemoradiotherapy before total mesorectal excision is currently the standard treatment of Stage II and III rectal cancer patients. We used known predictive pharmacogenetic biomarkers to identify the responders to preoperative chemoradiotherapy in our series. Methods and Materials: A total of 93 Stage II-III rectal cancer patients were genotyped using peripheral blood samples. The genes analyzed were X-ray cross-complementing group 1 (XRCC1), ERCC1, MTHFR, EGFR, DPYD, and TYMS. The patients were treated with 225 mg/m²/d continuous infusion of 5-fluorouracil concomitantly with radiotherapy (50.4 Gy) followed by total mesorectal excision. The outcomes were measured by tumor regression grade (TRG) as a major response (TRG 1 and TRG 2) or as a poor response (TRG 3, TRG 4, and TRG 5). Results: The major histopathologic response rate was 47.3%. XRCC1 G/G carriers had a greater probability of response than G/A carriers (odds ratio, 4.18; 95% confidence interval, 1.62-10.74; p = .003). Patients with polymorphisms associated with high expression of thymidylate synthase (2R/3G, 3C/3G, and 3G/3G) showed a greater pathologic response rate compared with carriers of low expression (odds ratio, 2.65; 95% confidence interval, 1.10-6.39; p = .02). No significant differences were seen in the response according to EGFR, ERCC1, MTHFR C677, and MTHFR A1298 expression. Conclusions: XRCC1 G/G and thymidylate synthase (2R/3G, 3C/3G, and 3G/3G) are independent factors of a major response. Germline thymidylate synthase and XRCC1 polymorphisms might be useful as predictive markers of rectal tumor response to neoadjuvant chemoradiotherapy with 5-fluorouracil.
CANDIDATE PLANETS IN THE HABITABLE ZONES OF KEPLER STARS
Gaidos, Eric
2013-06-20
A key goal of the Kepler mission is the discovery of Earth-size transiting planets in "habitable zones" where stellar irradiance maintains a temperate climate on an Earth-like planet. Robust estimates of planet radius and irradiance require accurate stellar parameters, but most Kepler systems are faint, making spectroscopy difficult and prioritization of targets desirable. The parameters of 2035 host stars were estimated by Bayesian analysis and the probabilities p_HZ that 2738 candidate or confirmed planets orbit in the habitable zone were calculated. Dartmouth Stellar Evolution Program models were compared to photometry from the Kepler Input Catalog, priors for stellar mass, age, metallicity and distance, and planet transit duration. The analysis yielded probability density functions for calculating confidence intervals of planet radius and stellar irradiance, as well as p_HZ. Sixty-two planets have p_HZ > 0.5 and a most probable stellar irradiance within habitable zone limits. Fourteen of these have radii less than twice that of Earth; the objects most resembling Earth in terms of radius and irradiance are KOIs 2626.01 and 3010.01, which orbit late K/M-type dwarf stars. The fraction of Kepler dwarf stars with Earth-size planets in the habitable zone (η⊕) is 0.46, with a 95% confidence interval of 0.31-0.64. Parallaxes from the Gaia mission will reduce uncertainties by more than a factor of five and permit definitive assignments of transiting planets to the habitable zones of Kepler stars.
Methods for measuring lead concentrations in paint films
McKnight, M.E.; Byrd, W.E.; Roberts, W.E.; Lagergren, E.S.
1989-12-01
Recent legislation required the U.S. Department of Housing and Urban Development (HUD) to establish procedures to abate lead-based paint in existing HUD-assisted housing. The legislation also required HUD to assess the accuracy, precision, reliability, and safety of methods for measuring lead content of paint films and to investigate the availability of testers and samplers. The National Institute of Standards and Technology was requested to carry out the assessment. With regard to accuracy and precision of field measurements, it was concluded that: chemical spot tests, when carried out by an experienced analytical chemistry technician, can detect the presence of lead in paint films having concentrations in excess of 1 mg/sq cm about 90% of the time; the estimate of the precision of a field measurement procedure using lead-specific portable X-ray fluorescence (XRF) analyzers for lead concentrations near 1 mg/sq cm is ±0.6 mg/sq cm and the estimate of the bias is 0.2 mg/sq cm, resulting in a 95% confidence interval of ±1.4 mg/sq cm; and based upon very preliminary measurements using the latest version of the spectrum analyzer portable XRF, the 95% confidence interval for field measurements is estimated to be ±0.5 mg/sq cm. In addition to field methods, standard laboratory procedures can be used to measure the lead content of paint samples to within a few percent of the quantity present over a wide range extending from less than 0.1 to over 10 mg/sq cm. Sample collection and sample dissolution procedures were also investigated.
Chen, Pei-Chun [Department of Statistics and Informatics Science, Providence University, Taiwan (China); Chen, Yen-Ching [Institute of Epidemiology Preventive Medicine, College of Public Health, National Taiwan University, Taiwan (China); Research Center for Gene, Environment, and Human Health, College of Public Health, National Taiwan University, Taiwan (China); Department of Public Health, Institute of Epidemiology, National Taiwan University, Taiwan (China); Lai, Liang-Chuan [Graduate Institute of Physiology, National Taiwan University, Taiwan (China); Tsai, Mong-Hsun [Institute of Biotechnology, National Taiwan University, Taiwan (China); Chen, Shin-Kuang [National Clinical Trial and Research Center, National Taiwan University Hospital, Taiwan (China); Yang, Pei-Wen; Lee, Yung-Chie [Department of Surgery, National Taiwan University Hospital, Taiwan (China); Hsiao, Chuhsing K. [Research Center for Gene, Environment, and Human Health, College of Public Health, National Taiwan University, Taiwan (China); Department of Public Health, Institute of Epidemiology, National Taiwan University, Taiwan (China); Bioinformatics and Biostatistics Core, Research Center for Medical Excellence, National Taiwan University, Taiwan (China); Lee, Jang-Ming, E-mail: jangming@ntuh.gov.tw [Department of Surgery, National Taiwan University Hospital, Taiwan (China); Chuang, Eric Y., E-mail: chuangey@ntu.edu.tw [National Clinical Trial and Research Center, National Taiwan University Hospital, Taiwan (China); Bioinformatics and Biostatistics Core, Research Center for Medical Excellence, National Taiwan University, Taiwan (China); Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taiwan (China)
2012-04-01
Purpose: To identify germline polymorphisms to predict concurrent chemoradiation therapy (CCRT) response in esophageal cancer patients. Materials and Methods: A total of 139 esophageal cancer patients treated with CCRT (cisplatin-based chemotherapy combined with 40 Gy of irradiation) and subsequent esophagectomy were recruited at the National Taiwan University Hospital between 1997 and 2008. After excluding confounding factors (i.e., females and patients aged ≥70 years), 116 patients were enrolled to identify single nucleotide polymorphisms (SNPs) associated with specific CCRT responses. Genotyping arrays and mass spectrometry were used sequentially to determine germline polymorphisms from blood samples. These polymorphisms remain stable throughout disease progression, unlike somatic mutations from tumor tissues. Two-stage design and additive genetic models were adopted in this study. Results: From the 26 SNPs identified in the first stage, 2 SNPs were found to be significantly associated with CCRT response in the second stage. Single nucleotide polymorphism rs16863886, located between SGPP2 and FARSB on chromosome 2q36.1, was significantly associated with a 3.93-fold increase in pathologic complete response to CCRT (95% confidence interval 1.62-10.30) under additive models. Single nucleotide polymorphism rs4954256, located in ZRANB3 on chromosome 2q21.3, was associated with a 3.93-fold increase in pathologic complete response to CCRT (95% confidence interval 1.57-10.87). The predictive accuracy for CCRT response was 71.59% with these two SNPs combined. Conclusions: This is the first study to identify germline polymorphisms with a high accuracy for predicting CCRT response in the treatment of esophageal cancer.
Silva, Priyamal; West, Catharine M.; Slevin, Nick F.R.C.R.; Valentine, Helen; Ryder, W. David J. Grad. I.S.; Hampson, Lynne; Bibi, Rufzan; Sloan, Philip; Thakker, Nalin; Homer, Jarrod; Hampson, Ian
2007-09-01
Purpose: Vaults are multi-subunit structures that may be involved in nucleo-cytoplasmic transport, with the major vault protein (MVP or lung resistance-related protein [LRP]) being the main component. The MVP gene is located on chromosome 16 close to the multidrug resistance-associated protein and protein kinase C-β genes. The role of MVP in cancer drug resistance has been demonstrated in various cell lines as well as in ovarian carcinomas and acute myeloid leukemia, but nothing is known about its possible role in radiation resistance. Our aim was to examine this in head-and-neck squamous cell carcinoma (HNSCC). Methods and Materials: Archived biopsy material was obtained for 78 patients with squamous cell carcinoma of the oropharynx who received primary radiotherapy with curative intent. Immunohistochemistry was used to detect MVP expression. Locoregional failure and cancer-specific survival were estimated using cumulative incidence and Cox multivariate analyses. Results: In univariate and multivariate analyses, MVP expression was strongly associated with both locoregional failure and cancer-specific survival. After adjustment for disease site, stage, grade, anemia, smoking, alcohol, gender, and age, the estimated hazard ratio for high MVP (2/3) compared with low (0/1) was 4.98 (95% confidence interval, 2.17-11.42; p = 0.0002) for locoregional failure and 4.28 (95% confidence interval, 1.85-9.95; p = 0.001) for cancer-specific mortality. Conclusion: These data are the first to show that MVP may be a useful prognostic marker associated with radiotherapy resistance in a subgroup of patients with HNSCC.
Clinical Evaluation of Stereotactic Target Localization Using 3-Tesla MRI for Radiosurgery Planning
MacFadden, Derek [University of Toronto Faculty of Medicine, Toronto, ON (Canada); Zhang Beibei; Brock, Kristy K. [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Hodaie, Mojgan [Division of Neurosurgery, Toronto Western Hospital, Toronto, ON (Canada); Laperriere, Normand [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada); Schwartz, Michael [Division of Neurosurgery, Toronto Western Hospital, Toronto, ON (Canada); Tsao, May [Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada); Department of Radiation Oncology, Sunnybrook Regional Cancer Centre, Toronto, ON (Canada); Stainsby, Jeffrey [Applied Science Laboratories, GE Healthcare, Mississauga, ON (Canada); Lockwood, Gina [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Mikulis, David [Department of Medical Imaging, University Health Network, Toronto, ON (Canada); Menard, Cynthia, E-mail: cynthia.menard@rmp.uhn.on.c [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada)
2010-04-15
Purpose: Increasing the magnetic resonance imaging (MRI) field strength can improve image resolution and quality, but concerns remain regarding the influence on geometric fidelity. The objectives of the present study were to spatially investigate the effect of 3-Tesla (3T) MRI on clinical target localization for stereotactic radiosurgery. Methods and Materials: A total of 39 patients were enrolled in a research ethics board-approved prospective clinical trial. Imaging (1.5T and 3T MRI and computed tomography) was performed after stereotactic frame placement. Stereotactic target localization at 1.5T vs. 3T was retrospectively analyzed in a representative cohort of patients with tumor (n = 4) and functional (n = 5) radiosurgical targets. The spatial congruency of the tumor gross target volumes was determined by the mean discrepancy between the average gross target volume surfaces at 1.5T and 3T. Reproducibility was assessed by the displacement from an averaged surface and volume congruency. Spatial congruency and the reproducibility of functional radiosurgical targets was determined by comparing the mean and standard deviation of the isocenter coordinates. Results: Overall, the mean absolute discrepancy across all patients was 0.67 mm (95% confidence interval, 0.51-0.83), significantly <1 mm (p < .010). No differences were found in the overall interuser target volume congruence (mean, 84% for 1.5T vs. 84% for 3T, p > .4), and the gross target volume surface mean displacements were similar within and between users. The overall average isocenter coordinate discrepancy for the functional targets at 1.5T and 3T was 0.33 mm (95% confidence interval, 0.20-0.48), with no patient-specific differences between the mean values (p >.2) or standard deviations (p >.1). Conclusion: Our results have provided clinically relevant evidence supporting the spatial validity of 3T MRI for use in stereotactic radiosurgery under the imaging conditions used.
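The claim above that the mean discrepancy of 0.67 mm (95% CI, 0.51-0.83) is significantly below 1 mm is the usual one-sample construction: mean ± t·s/√n, then a test against the 1-mm reference. A minimal sketch; the per-patient discrepancy values and the n = 9 sample below are hypothetical, since the abstract reports only summary statistics.

```python
import math
import statistics

def mean_ci(values, t_crit):
    """Two-sided CI for a sample mean: mean +/- t_crit * s / sqrt(n)."""
    n = len(values)
    m = statistics.mean(values)
    se = statistics.stdev(values) / math.sqrt(n)  # standard error of the mean
    return m, m - t_crit * se, m + t_crit * se

# Hypothetical per-patient mean surface discrepancies in mm (n = 9 patients,
# so t_crit = 2.306 for 8 degrees of freedom at 95% confidence).
disc = [0.45, 0.52, 0.58, 0.63, 0.70, 0.74, 0.78, 0.84, 0.79]
m, lo, hi = mean_ci(disc, t_crit=2.306)
print(hi < 1.0)  # upper CI bound below the 1-mm reference -> True
```

The same interval read the other way (does the CI exclude 1 mm?) is equivalent to the one-sided t-test the abstract quotes a p-value for.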
The Fermi-GBM X-ray burst monitor: thermonuclear bursts from 4U 0614+09
Linares, M; Jenke, P; van der Horst, A J; Camero-Arranz, A; Kouveliotou, C; Chakrabarty, D; Beklen, E; Bhat, P N; Briggs, M S; Finger, M; Paciesas, W; Preece, R; von Kienlin, A; Wilson-Hodge, C A
2012-01-01
Thermonuclear bursts from slowly accreting neutron stars (NSs) have proven difficult to detect, yet they are potential probes of the thermal properties of the neutron star interior. During the first year of a systematic all-sky search for X-ray bursts using the Gamma-ray Burst Monitor (GBM) aboard the Fermi Gamma-ray Space Telescope we have detected 15 thermonuclear bursts from the NS low-mass X-ray binary 4U 0614+09, when it was accreting at nearly 1% of the Eddington limit. We measured an average burst recurrence time of 12 ± 3 d (68% confidence interval) between March 2010 and March 2011, classified all bursts as normal duration bursts and placed a lower limit on the recurrence time of long/intermediate bursts of 62 d (95% confidence level). We discuss how observations of thermonuclear bursts in the hard X-ray band compare to pointed soft X-ray observations, and quantify such bandpass effects on measurements of burst radiated energy and duration. We put our results for 4U 0614+09 in the context of other bu...
Detecting an association between Gamma Ray and Gravitational Wave Bursts
Lee Samuel Finn; Soumya D. Mohanty; Joseph D. Romano
1999-03-30
If γ-ray bursts (GRBs) are accompanied by gravitational wave bursts (GWBs), the correlated output of two gravitational wave detectors evaluated in the moments just prior to a GRB will differ from that evaluated at times not associated with a GRB. We can test for this difference independently of any model of the GWB signal waveform. If we invoke a model for the GRB source population and GWB radiation spectral density, we can find a confidence interval or upper limit on the root-mean-square GWB signal amplitude in the detector waveband. To illustrate, we adopt a simple, physically motivated model and estimate that initial LIGO detector observations coincident with 1000 GRBs could lead us to exclude, with 95% confidence, associated GWBs with h_RMS ≳ 1.7 × 10⁻²². This result does not require that the detector noise be Gaussian or that any inter-detector correlated noise be measured or measurable; it does not require advance or a priori knowledge of the source waveform; and the limits obtained on the wave strength improve with the number of observed GRBs.
2013-01-21
Oak Ridge Associated Universities (ORAU), under the Oak Ridge Institute for Science and Education (ORISE) contract, collected split surface water samples with Nuclear Fuel Services (NFS) representatives on November 15, 2012. Representatives from the U.S. Nuclear Regulatory Commission and Tennessee Department of Environment and Conservation were also in attendance. Samples were collected at four surface water stations, as required in the approved Request for Technical Assistance number 11-018. These stations included Nolichucky River upstream (NRU), Nolichucky River downstream (NRD), Martin Creek upstream (MCU), and Martin Creek downstream (MCD). Both ORAU and NFS performed gross alpha and gross beta analyses, and the results are compared using the duplicate error ratio (DER), also known as the normalized absolute difference. A DER ≤ 3 indicates that, at a 99% confidence interval, split sample results do not differ significantly when compared to their respective one standard deviation (sigma) uncertainty (ANSI N42.22). The NFS split sample report does not specify the confidence level of reported uncertainties (NFS 2012). Therefore, standard two sigma reporting is assumed and uncertainty values were divided by 1.96. In conclusion, all DER values were less than 3 and results are consistent with low (e.g., background) concentrations.
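The duplicate error ratio described above divides the difference between split-sample results by their combined one-sigma uncertainty; results agree (per ANSI N42.22, at roughly the 99% level) when DER ≤ 3. A small sketch with the 2σ-to-1σ rescaling the report applies; the activity values below are hypothetical, not figures from the report.

```python
import math

def duplicate_error_ratio(x1, u1, x2, u2):
    """Duplicate error ratio (normalized absolute difference): the gap between
    split-sample results x1 and x2 in units of their combined 1-sigma uncertainty."""
    return abs(x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)

# Hypothetical gross-beta results for one station. The NFS uncertainty is
# assumed to be reported at 2 sigma, so rescale to 1 sigma as the report does.
orau_result, orau_unc_1s = 4.1, 1.2
nfs_result, nfs_unc_2s = 5.0, 2.4
nfs_unc_1s = nfs_unc_2s / 1.96

der = duplicate_error_ratio(orau_result, orau_unc_1s, nfs_result, nfs_unc_1s)
print(der <= 3)  # ANSI N42.22 agreement criterion -> True
```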
Liu, Y.; Hoeksema, J. T.; Sun, X.
2014-03-01
Magnetic twist in solar active regions (ARs) has been found to have a hemispheric preference in sign (the hemisphere rule): negative in the northern hemisphere and positive in the southern. The preference reported in previous studies ranges greatly, from ∼58% to 82%. In this study, we examine this hemispheric preference using vector magnetic field data taken by the Helioseismic and Magnetic Imager and find that 75% ± 7% of the 151 ARs studied obey the hemisphere rule, well within the range of previous studies. If the sample is divided into two groups, ARs in which magnetic twist and writhe have the same sign and ARs in which the signs are opposite, the strength of the hemispheric preference differs substantially: 64% ± 11% for the former group and 87% ± 8% for the latter. This difference becomes even more significant in a sub-sample of 82 ARs having a simple bipole magnetic configuration: 56% ± 16% for the ARs having the same signs of twist and writhe, and 93% with lower and upper confidence bounds of 80% and 98% for the ARs having the opposite signs. The errors reported here are 95% confidence intervals. This may suggest that, prior to the emergence of magnetic flux tubes, either the sign of twist has no hemispheric preference or the twist is relatively weak.
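A 95% confidence interval of roughly ±7% on a proportion out of 151 ARs can be reproduced with a standard binomial interval. A sketch using the Wilson score interval, assuming an illustrative count of 113 of 151 ARs (≈75%) obeying the rule; the exact count is not stated in the abstract.

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion k/n
    (z = 1.96 gives the two-sided 95% interval)."""
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

lo, hi = wilson_interval(113, 151)  # illustrative: 113/151 ~ 75% obey the rule
print(f"{113 / 151:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```

For these numbers the interval comes out near 67%-81%, consistent with the ±7%-scale uncertainty quoted above.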
ON COMPUTING UPPER LIMITS TO SOURCE INTENSITIES
Kashyap, Vinay L.; Siemiginowska, Aneta [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Van Dyk, David A.; Xu Jin [Department of Statistics, University of California, Irvine, CA 92697-1250 (United States); Connors, Alanna [Eureka Scientific, 2452 Delmer Street, Suite 100, Oakland, CA 94602-3017 (United States); Freeman, Peter E. [Department of Statistics, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Zezas, Andreas, E-mail: vkashyap@cfa.harvard.ed, E-mail: asiemiginowska@cfa.harvard.ed, E-mail: dvd@ics.uci.ed, E-mail: jinx@ics.uci.ed, E-mail: aconnors@eurekabayes.co, E-mail: pfreeman@cmu.ed, E-mail: azezas@cfa.harvard.ed [Physics Department, University of Crete, P.O. Box 2208, GR-710 03, Heraklion, Crete (Greece)
2010-08-10
A common problem in astrophysics is determining how bright a source could be and still not be detected in an observation. Despite the simplicity with which the problem can be stated, the solution involves complicated statistical issues that require careful analysis. In contrast to the more familiar confidence bound, this concept has never been formally analyzed, leading to a great variety of often ad hoc solutions. Here we formulate and describe the problem in a self-consistent manner. Detection significance is usually defined by the acceptable proportion of false positives (background fluctuations that are claimed as detections, or Type I error), and we invoke the complementary concept of false negatives (real sources that go undetected, or Type II error), based on the statistical power of a test, to compute an upper limit to the detectable source intensity. To determine the minimum intensity that a source must have for it to be detected, we first define a detection threshold and then compute the probabilities of detecting sources of various intensities at the given threshold. The intensity that corresponds to the specified Type II error probability defines that minimum intensity and is identified as the upper limit. Thus, an upper limit is a characteristic of the detection procedure rather than the strength of any particular source. It should not be confused with confidence intervals or other estimates of source intensity. This is particularly important given the large number of catalogs that are being generated from increasingly sensitive surveys. We discuss, with examples, the differences between these upper limits and confidence bounds. Both measures are useful quantities that should be reported in order to extract the most science from catalogs, though they answer different statistical questions: an upper bound describes an inference range on the source intensity, while an upper limit calibrates the detection process. 
We provide a recipe for computing upper limits that applies to all detection algorithms.
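The construction above (fix a detection threshold from the acceptable Type I error, then find the source intensity whose Type II error drops to the chosen level) can be sketched for Poisson counts. The background level, alpha, and beta below are illustrative choices, not values from the paper.

```python
import math

def pois_sf(k, mu):
    """P(N >= k) for N ~ Poisson(mu)."""
    return 1.0 - sum(math.exp(-mu) * mu ** i / math.factorial(i) for i in range(k))

def detection_threshold(b, alpha=0.01):
    """Smallest count n* whose false-positive (Type I) probability
    under background b alone is <= alpha."""
    n = 0
    while pois_sf(n, b) > alpha:
        n += 1
    return n

def upper_limit(b, alpha=0.01, beta=0.5, step=0.01):
    """Source intensity at which the Type II error falls to beta, i.e. a source
    this bright is detected (N >= n*) with probability >= 1 - beta."""
    n_star = detection_threshold(b, alpha)
    s = 0.0
    while pois_sf(n_star, b + s) < 1.0 - beta:
        s += step
    return n_star, s

# Illustrative: background of 3 expected counts, 1% false-positive rate,
# 50% required detection probability.
n_star, s_ul = upper_limit(b=3.0, alpha=0.01, beta=0.5)
```

As the abstract stresses, `s_ul` characterizes the detection procedure (threshold plus statistical power), not the intensity of any particular source.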
CMS Collaboration
2015-04-14
A search has been performed for long-lived particles that could have come to rest within the CMS detector, using the time intervals between LHC beam crossings. The existence of such particles could be deduced from observation of their decays via energy deposits in the CMS calorimeter appearing at times that are well separated from any proton-proton collisions. Using a data set corresponding to an integrated luminosity of 18.6 inverse femtobarns of 8 TeV proton-proton collisions, and a search interval corresponding to 281 hours of trigger livetime, 10 events are observed, with a background prediction of 13.2 +3.6 -2.5 events. Limits are presented at 95% confidence level on gluino and top squark production, for over 13 orders of magnitude in the mean proper lifetime of the stopped particle. Assuming a cloud model of R-hadron interactions, a gluino with mass < 1000 GeV and a top squark with mass < 525 GeV are excluded, for lifetimes between 1 microsecond and 1000 s. These results are the most stringent constraints on stopped particles to date.
Search for decays of stopped long-lived particles produced in proton–proton collisions at √s = 8 TeV
Khachatryan, V.
2015-04-11
A search has been performed for long-lived particles that could have come to rest within the CMS detector, using the time intervals between LHC beam crossings. The existence of such particles could be deduced from observation of their decays via energy deposits in the CMS calorimeter appearing at times that are well separated from any proton–proton collisions. Using a data set corresponding to an integrated luminosity of 18.6 fb⁻¹ of 8 TeV proton–proton collisions, and a search interval corresponding to 281 h of trigger livetime, 10 events are observed, with a background prediction of 13.2 +3.6/−2.5 events. Limits are presented at 95% confidence level on gluino and top squark production, for over 13 orders of magnitude in the mean proper lifetime of the stopped particle. Assuming a cloud model of R-hadron interactions, a gluino with mass below 1000 GeV and a top squark with mass below 525 GeV are excluded, for lifetimes between 1 μs and 1000 s. These results are the most stringent constraints on stopped particles to date.
Chen, Chien Peter [Department of Radiation Oncology, Scripps Clinic, San Diego, California (United States)] [Department of Radiation Oncology, Scripps Clinic, San Diego, California (United States); Weinberg, Vivian [Comprehensive Cancer Center Biostatistics Core, University of California—San Francisco, San Francisco, California (United States)] [Comprehensive Cancer Center Biostatistics Core, University of California—San Francisco, San Francisco, California (United States); Shinohara, Katsuto [Department of Urology, University of California—San Francisco, San Francisco, California (United States)] [Department of Urology, University of California—San Francisco, San Francisco, California (United States); Roach, Mack; Nash, Marc; Gottschalk, Alexander; Chang, Albert J. [Department of Radiation Oncology, University of California—San Francisco, San Francisco, California (United States)] [Department of Radiation Oncology, University of California—San Francisco, San Francisco, California (United States); Hsu, I-Chow, E-mail: IHsu@radonc.ucsf.edu [Department of Radiation Oncology, University of California—San Francisco, San Francisco, California (United States)] [Department of Radiation Oncology, University of California—San Francisco, San Francisco, California (United States)
2013-06-01
Purpose: Evaluate efficacy and toxicity of salvage high-dose-rate brachytherapy (HDRB) for locally recurrent prostate cancer after definitive radiation therapy (RT). Methods and Materials: We retrospectively analyzed 52 consecutively accrued patients undergoing salvage HDRB between 1998 and 2009 for locally recurrent prostate cancer after previous definitive RT. After pathologic confirmation of locally recurrent disease, patients received 36 Gy in 6 fractions. Twenty-four patients received neoadjuvant hormonal therapy before salvage, and no patients received adjuvant hormonal therapy. Determination of biochemical failure (bF) after salvage HDRB was based on the Phoenix definition. Overall survival (OS) and bF distributions were calculated using the Kaplan-Meier method. Univariate analyses were performed to identify predictors of biochemical control. Acute and late genitourinary (GU) and gastrointestinal (GI) toxicities, based on Common Terminology Criteria for Adverse Events (version 4), were documented. Results: Median follow-up after salvage HDRB was 59.6 months. The 5-year OS estimate was 92% (95% confidence interval [CI]: 80%-97%), with median survival not yet reached. Five-year biochemical control after salvage was 51% (95% CI: 34%-66%). Median prostate-specific antigen (PSA) nadir postsalvage was 0.1 (range: 0-7.2), reached at a median of 10.2 months after completing HDRB. As for complications, acute and late grade 3 GU toxicities were each observed in only 2%. No grade 2 or higher acute GI events were observed, and grade 2 late GI events occurred in 4%. On univariate analysis, disease-free interval after initial definitive RT (P=.07), percent of positive cores at the time of diagnosis (P=.08), interval from first recurrence to salvage HDRB (P=.09), and pre-HDRB PSA (P=.07) were each of borderline significance in predicting biochemical control after salvage HDRB. Conclusions: Prostate HDRB is an effective salvage modality with relatively few long-term toxicities.
We provide potential predictors of biochemical control for prostate salvage HDRB.
Unilateral and Bilateral Breast Cancer in Women Surviving Pediatric Hodgkin's Disease
Basu, Swati K. [Department of Community and Preventive Medicine, James P. Wilmot Cancer Center at University of Rochester, Rochester, NY (United States); Schwartz, Cindy [Department of Hematology-Oncology, The Johns Hopkins Hospital, Baltimore, MD (United States); Fisher, Susan G. [Department of Community and Preventive Medicine, James P. Wilmot Cancer Center at University of Rochester, Rochester, NY (United States); Hudson, Melissa M. [Department of Hematology-Oncology, St. Jude Children's Research Hospital, Memphis, TN (United States); Tarbell, Nancy [Department of Pediatric Radiation Oncology, Massachusetts General Hospital, Boston, MA (United States); Muhs, Ann [Department of Radiation Oncology, James P. Wilmot Cancer Center at University of Rochester, Rochester, NY (United States); Marcus, Karen J. [Department of Radiation Oncology, Brigham and Women's Hospital, Boston, MA (United States); Mendenhall, Nancy [Department of Radiation Oncology, University of Florida Medical Center, Gainesville, FL (United States); Mauch, Peter [Department of Radiation Oncology, Brigham and Women's Hospital, Boston, MA (United States); Kun, Larry E. [Department of Radiological Sciences, St. Jude Children's Research Hospital, Memphis, TN (United States); Constine, Louis S. [Department of Radiation Oncology, James P. Wilmot Cancer Center at University of Rochester, Rochester, NY (United States); Department of Pediatrics, James P. Wilmot Cancer Center at University of Rochester, Rochester, NY (United States)], E-mail: louis_constine@urmc.rochester.edu
2008-09-01
Purpose: To define demographic and therapeutic associations with the risk of breast cancer in children treated for Hodgkin's disease (HD), particularly the frequency and interval to the development of contralateral breast cancer. Methods and Materials: All 398 female patients (<19 years) treated for HD in five institutions during the accrual period were evaluated. Mean follow-up was 16.9 years. The standardized incidence ratio (SIR) was calculated as the ratio of the observed number of cases to the expected number of cases, estimated using age-matched controls from the Surveillance, Epidemiology, and End Results database. Results: A total of 29 women developed breast cancer (25 invasive, 4 ductal carcinoma in situ; SIR, 37.25; 95% confidence interval, 24.96-53.64). Time to diagnosis was 9.4 to 36.1 years. Cumulative incidence was 24% at 30 years. Ten patients (34%) had bilateral disease (9 metachronous, 1 synchronous). The interval to contralateral breast cancer was 12 to 34 months. On univariate analysis, significant variables included stage of HD, mantle radiation dose, pelvic radiation (protective), and follow-up time. On multivariate analysis, early stage and older age at diagnosis of HD (≤12 vs. >12 years) were significant predictors of secondary breast cancer. Conclusions: Women surviving pediatric HD were found to have a 37-fold increase in the risk of breast cancer and a high likelihood of rapidly developing bilateral disease. Early-stage HD and age greater than 12 years at diagnosis of HD were independent risk factors. Higher radiation doses may augment risk, and pelvic radiation may be protective. Breast cancer screening methodology and frequency, plus the role of prophylaxis in patients with unilateral disease, require definition.
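An SIR interval like the one above (29 observed cases, SIR 37.25, 95% CI 24.96-53.64) is typically an exact Poisson confidence interval on the observed count, divided by the expected count. A sketch using bisection on the Poisson CDF; the expected count is back-solved from the reported SIR rather than taken from the paper.

```python
import math

def _pois_cdf(k, mu):
    """P(N <= k) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** i / math.factorial(i) for i in range(k + 1))

def poisson_exact_ci(observed, alpha=0.05):
    """Exact (Garwood-style) two-sided CI for a Poisson mean, by bisection."""
    def solve(f, lo, hi):
        # f is True below the root and False above it; bisect to the flip point.
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if f(mid):
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    hi0 = observed * 10 + 10
    # lower bound: lambda where P(N >= observed | lambda) = alpha/2
    lower = 0.0 if observed == 0 else solve(
        lambda lam: 1 - _pois_cdf(observed - 1, lam) < alpha / 2, 0.0, hi0)
    # upper bound: lambda where P(N <= observed | lambda) = alpha/2
    upper = solve(lambda lam: _pois_cdf(observed, lam) > alpha / 2, 0.0, hi0)
    return lower, upper

observed, expected = 29, 29 / 37.25  # expected count implied by the reported SIR
lo, hi = poisson_exact_ci(observed)
# lo/expected, hi/expected land close to the reported 95% CI of 24.96-53.64
```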
Goda, Jayant S.; Massey, Christine; Kuruvilla, John; Gospodarowicz, Mary K.; Wells, Woodrow; Hodgson, David C.; Sun, Alexander; Keating, Armand; Crump, Michael; Tsang, Richard W.
2012-11-01
Purpose: To analyze, through chart review, the efficacy of salvage radiation therapy (sRT) for relapsed or progressive Hodgkin lymphoma (HL) patients who failed autologous stem cell transplant (ASCT). Patients and Methods: Among 347 patients with recurrent/refractory HL who received ASCT from 1986-2006, 163 had post-ASCT progression or relapse. Of these, 56 received sRT and form the basis of this report. Median age at sRT was 30 years (range, 17-59 years). Disease was confined to lymph nodes in 27 patients, whereas 24 had both nodal and extranodal disease. Salvage radiation therapy alone was given in 34 patients (61%), and sRT plus chemotherapy was given in 22 (39%). Median interval from ASCT to sRT was 0.8 years (range, 0.1-5.6 years). The median dose was 35 Gy (range, 8-40.3 Gy). The sRT technique was extended-field in 14 patients (25%) and involved-field in 42 (75%). Results: The median follow-up from sRT was 31.3 months (range, 0.2-205.5 months). Overall response rate was 84% (complete response: 36%; partial response: 48%). The median overall survival was 40.8 months (95% confidence interval, 34.2-56.3 months). The 5-year overall survival was 29% (95% confidence interval, 14%-44%). The 2-year progression-free survival (PFS) was 16%; the 2-year local PFS was 65%, whereas the 2-year systemic PFS was 17%. The 1-year PFS was higher in patients in whom all diseased sites were irradiated (49%) compared with those in whom only the symptomatic site was treated (22%, P=.07). Among 20 alive patients, 5 were disease free (at 6.4, 6.8, 7.4, 7.9, and 17.1 years). Conclusion: For patients with HL who fail ASCT, a selective use of RT provides a durable local control rate of 65% at 2 years and should be considered as part of the standard management plan for the palliation of incurable HL. Occasionally irradiation of truly localized disease can lead to long-term survival.
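For raw proportions such as the 84% overall response rate, a binomial interval applies (the survival figures above are Kaplan-Meier estimates, a different calculation). A sketch using the Wilson score interval; taking the 36% complete-response rate as 20 of the 56 patients is a back-calculation for illustration, not a count from the report:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (z = 1.96 for ~95% confidence)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half
```

For example, `wilson_ci(20, 56)` returns roughly (0.24, 0.49): a 36% point estimate carries a wide interval at this sample size.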
2011-01-01
with 60 min interval group and the seven subfractions with 30 min interval group, the seven subfractions with 5
Monthly energy review, March 1998
1998-03-01
The Monthly Energy Review (MER) presents an overview of the Energy Information Administration's recent monthly energy statistics. The statistics cover the major activities of U.S. production, consumption, trade, stocks, and prices for petroleum, natural gas, coal, electricity, and nuclear energy. Also included are international energy and thermal and metric conversion factors. Energy production during December 1997 totaled 5.9 quadrillion Btu, a 2.8 percent increase from the level of production during December 1996. Coal production increased 9.5 percent, natural gas production increased 3.9 percent, and production of crude oil and natural gas plant liquids decreased 1.1 percent. All other forms of energy production combined were down 6.9 percent from the level of production during December 1996.
Use of bimodal carbon distribution in compacts for producing metallic iron nodules
Iwasaki, Iwao
2012-10-16
A method for use in production of metallic iron nodules comprising providing a reducible mixture into a hearth furnace for the production of metallic iron nodules, where the reducible mixture comprises a quantity of reducible iron bearing material, a quantity of first carbonaceous reducing material of a size less than about 28 mesh of an amount between about 65 percent and about 95 percent of a stoichiometric amount necessary for complete iron reduction of the reducible iron bearing material, and a quantity of second carbonaceous reducing material with an average particle size greater than average particle size of the first carbonaceous reducing material and a size between about 3 mesh and about 48 mesh of an amount between about 20 percent and about 60 percent of a stoichiometric amount necessary for complete iron reduction of the reducible iron bearing material.
Nordine, P.C.
1983-08-01
A maximum F-atom yield from F2 occurs in a combustion driven hydrogen fluoride supersonic diffusion laser (HFSDL) because the amount of fluorine reacted with hydrogen (or deuterium) continues to increase with temperature after most of the unreacted fluorine has been thermally dissociated. A small decrease from the maximum combustor F-atom yield allows a significant decrease in the required temperature and in the corrosion rates that uncooled laser nozzles would display. The temperatures that give F-atom yields equal to 95 percent of the maximum values were calculated for typical HFSDL combustor pressures and F-atom mole fractions, and the corrosion rates of uncooled nozzles were evaluated at these temperatures. The corrosion rates of materials resistant to fluorine attack at the highest temperatures would allow HFSDL applications or test experiments up to several hours' duration.
Intern experience at the Conoco VCM plant: an internship report
Hall, James Josiah
2013-03-13
of all future measurements to fall within plus or minus 3.8 of 5.0. In other words, the measurement technique shows the concentration to be 5.0 ppm ± 3.8 ppm with a 95 percent confidence level.
Becker, N.M.; Vanta, E.B.
1995-05-01
Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980s at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had been transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments.
Use of bimodal carbon distribution in compacts for producing metallic iron nodules
Iwasaki, Iwao
2014-04-08
A method for use in production of metallic iron nodules comprising providing a reducible mixture into a hearth furnace for the production of metallic iron nodules, where the reducible mixture comprises a quantity of reducible iron bearing material, a quantity of first carbonaceous reducing material of a size less than about 28 mesh of an amount between about 65 percent and about 95 percent of a stoichiometric amount necessary for complete iron reduction of the reducible iron bearing material, and a quantity of second carbonaceous reducing material with an average particle size greater than average particle size of the first carbonaceous reducing material and a size between about 3 mesh and about 48 mesh of an amount between about 20 percent and about 60 percent of a stoichiometric amount necessary for complete iron reduction of the reducible iron bearing material.
NONE
1996-02-01
A major objective of the coal-fired high performance power systems (HIPPS) program is to achieve significant increases in the thermodynamic efficiency of coal use for electric power generation. Through increased efficiency, all airborne emissions can be decreased, including emissions of carbon dioxide. High performance power systems as defined for this program are coal-fired, high efficiency systems where the combustion products from coal do not contact the gas turbine. Typically, this type of system will involve some indirect heating of gas turbine inlet air and then topping combustion with a cleaner fuel. The topping combustion fuel can be natural gas or another relatively clean fuel. Fuel gas derived from coal is an acceptable fuel for the topping combustion. The ultimate goal for HIPPS is to have a system that has 95 percent of its heat input from coal. Interim systems that have at least 65 percent heat input from coal are acceptable, but these systems are required to have a clear development path to a system that is 95 percent coal-fired. A three-phase program has been planned for the development of HIPPS. Phase 1, reported herein, includes the development of a conceptual design for a commercial plant. Technical and economic feasibility have been analyzed for this plant. Preliminary R&D on some aspects of the system was also done in Phase 1, and a Research, Development and Test plan was developed for Phase 2. Work in Phase 2 includes the testing and analysis that is required to develop the technology base for a prototype plant. This work includes pilot plant testing at a scale of around 50 MMBtu/hr heat input. The culmination of the Phase 2 effort will be a site-specific design and test plan for a prototype plant. Phase 3 is the construction and testing of this plant.
Advanced Techniques for Power System Identification from Measured Data
Pierre, John W.; Wies, Richard; Trudnowski, Daniel
2008-11-25
Time-synchronized measurements provide rich information for estimating a power-system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and testing those techniques on field measured data and through simulation. Experimental data from the western area power system was provided by PNNL and Bonneville Power Administration (BPA) for both ambient conditions and for signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) Block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field measured data. 
Subspace-based methods have been used to improve previous results from block processing techniques. Bootstrap techniques have been developed to estimate confidence intervals for the electromechanical modes from field-measured data. Results were obtained using injected-signal data provided by BPA. A new probing signal was designed that puts more strength into the signal for a given maximum peak-to-peak swing. Further simulations were conducted on a model based on measured data and with the modifications of the 19-machine simulation model. Montana Tech researchers participated in two primary activities: (1) continued development of the 19-machine simulation test system to include a DC line; and (2) extensive simulation analysis of the various system identification algorithms and bootstrap techniques using the 19-machine model. Researchers at the University of Alaska-Fairbanks focused on the development and testing of adaptive filter algorithms for mode estimation using data generated from simulation models and on data provided in collaboration with BPA and PNNL. Their efforts consisted of pre-processing field data and testing and refining adaptive filter techniques (specifically the Least Mean Squares (LMS), Adaptive Step-size LMS (ASLMS), and Error Tracking (ET) algorithms). They also improved convergence of the adaptive algorithms by using an initial estimate from a block-processing AR method to initialize the weight vector for LMS. Extensive testing was performed on simulated data from the 19-machine model. This project was also extensively involved in the WECC (Western Electricity Coordinating Council) system-wide tests carried out in 2005 and 2006. These tests involved injecting known probing signals into the western power grid. One of the primary goals of these tests was the reliable estimation of electromechanical mode properties from measured PMU data.
Applied to the system were three types of probing inputs: (1) activation of the Chief Joseph Dynamic Brake, (2) mid-level probing at the Pacific DC Intertie (PDCI), and (3) low-level probing on the PDCI. The Chief Joseph Dynamic Brake is a 1400 MW disturbance to the system and is injected for a ha
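The bootstrap confidence intervals developed for the mode estimates can be sketched generically with a percentile bootstrap. This is an illustration of the technique, not the project's actual estimator, and the damping-ratio values below are invented for the example:

```python
import random
import statistics

def percentile_bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap: resample with replacement, take empirical quantiles of the statistic."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    return reps[round(n_boot * alpha / 2)], reps[round(n_boot * (1 - alpha / 2)) - 1]

# Hypothetical damping-ratio estimates (%) from repeated ambient-data windows
damping = [4.1, 3.8, 4.4, 4.0, 3.6, 4.2, 3.9, 4.3, 3.7, 4.5]
lo, hi = percentile_bootstrap_ci(damping)  # 95% interval around the mean estimate
```

The same recipe works for any point estimator `stat`, which is why bootstrap intervals are attractive when no closed-form variance is available for a mode-estimation algorithm.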
Kucera, Paul A.
2009-06-26
Chinook salmon in the Snake River basin were listed as threatened under the Endangered Species Act in 1992 (NMFS 1992). The Secesh River represents the only stream in the Snake River basin where natural origin (wild) salmon escapement monitoring occurs at the population level, absent a supplementation program. As such the Secesh River has been identified as a long term salmon escapement and productivity monitoring site by the Nez Perce Tribe Department of Fisheries Resources Management. Salmon managers will use this data for effective population management and evaluation of the effect of conservation actions on a natural origin salmon population. The Secesh River also acts as a reference stream for supplementation program comparison. Dual frequency identification sonar (DIDSON) was used to determine adult spring and summer Chinook salmon escapement in the Secesh River in 2008. DIDSON technology was selected because it provided a non-invasive method for escapement monitoring that avoided listed species trapping and handling incidental mortality, and fish impedance related concerns. The DIDSON monitoring site was operated continuously from June 13 to September 14. The first salmon passage was observed on July 3. DIDSON site total estimated salmon escapement, natural and hatchery fish, was 888 ± 65 fish (95% confidence interval). Coefficient of variation associated with the escapement estimate was 3.7%. The DIDSON unit was operational 98.1% of the salmon migration period. Adult salmon migration timing in the Secesh River occurred over 74 days from July 3 to September 14, with 5,262 total fish passages observed. The spawning migration had 10%, median, and 90% passage dates of July 8, July 16, and August 12, respectively. The maximum number of net upstream migrating salmon was above the DIDSON monitoring site on August 27. Validation monitoring of DIDSON target counts with underwater optical cameras occurred for species identification.
A total of 860 optical camera identified salmon passage observations were identical to DIDSON target counts. However, optical cameras identified eight jack salmon (3 upstream, 5 downstream) less than 55 cm in length that DIDSON did not count as salmon because of the length criteria employed (≥ 55 cm). Precision of the DIDSON technology was evaluated by comparing estimated net upstream salmon escapement and associated 95% confidence intervals between two DIDSON sonar units operated over a five day period. The DIDSON 1 salmon escapement was 145.7 fish (± 2.3), and the DIDSON 2 escapement estimate was 150.5 fish (± 5). The overlap in the 95% confidence intervals suggested that the two escapement estimates were not significantly different from each other. Known length salmon carcass trials were conducted in 2008 to examine the accuracy of manually measured lengths, obtained using DIDSON software, on high frequency files at a 5 m window length. Linear regression demonstrated a highly significant relationship between known lengths and manually measured salmon carcass lengths (p < 0.0001). A positive bias in manual length measurement of 6.8% to 8% existed among the two observers in the analysis. Total Secesh River salmon escapement (natural origin and hatchery) in 2008 was 912 fish. Natural origin salmon escapement in the entire Secesh River drainage was 847 fish. The estimated natural origin spawner abundance was 836 fish. Salmon spawner abundance in 2008 increased threefold compared to 2007 abundance levels. The 10 year geometric mean natural origin spawner abundance was 538 salmon and was below the recommended viable population threshold level established by the ICTRT (2007). One additional Snake River basin salmon population was assessed for comparison of natural origin salmon spawner abundance. The Johnson Creek/EFSF Salmon River population had a 10 year geometric mean natural origin spawner abundance of 254 salmon.
Salmon spawner abundance levels in both streams were below viable population thresholds. DIDSON technology has been used in the Secesh River to determine salmo
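The overlap comparison above (145.7 ± 2.3 vs. 150.5 ± 5 fish at 95%) is a conservative screen; a z-test on the difference reaches the same conclusion for these numbers. A sketch that assumes the quoted ± values are 95% half-widths, so one-sigma errors are halfwidth/1.96:

```python
import math
from statistics import NormalDist

def compare_estimates(x1, hw1, x2, hw2, z_crit=1.96):
    """Compare two estimates given 95% half-widths: CI-overlap check and z-test on the difference."""
    overlap = (x1 - hw1) <= (x2 + hw2) and (x2 - hw2) <= (x1 + hw1)
    se1, se2 = hw1 / z_crit, hw2 / z_crit   # assumption: half-widths are 1.96-sigma
    z = abs(x1 - x2) / math.hypot(se1, se2)
    p = 2 * (1 - NormalDist().cdf(z))
    return overlap, z, p

# The two DIDSON escapement estimates from the report
overlap, z, p = compare_estimates(145.7, 2.3, 150.5, 5.0)
```

Here z ≈ 1.71 (p ≈ 0.09), agreeing with the overlap-based conclusion of no significant difference; note that non-overlap of 95% intervals is a stricter criterion than p < 0.05 on the difference.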
Primary Chinese Semantic-Phonetic Compounds Pronunciation Rules Mining and Visualization
confidence) (scatter plot) (graph-based visualization) (parallel coordinates plots) (double decker plot
US Relations with Mexico and Central America, 1977-1999
Rosenblum, Marc
2000-01-01
alliance for production” restored investor confidence. More importantly, by 1977 production based on vast new oil
The Belief Bias Effect Is Aptly Named: A Reply to Klauer and Kellen (2011)
Wixted, John T.
2011-01-01
) multinomial processing tree (MPT) model to confidence ratings (henceforth, MPTC) describes the data better
THE FERMI-GBM X-RAY BURST MONITOR: THERMONUCLEAR BURSTS FROM 4U 0614+09
Linares, M.; Chakrabarty, D. [Massachusetts Institute of Technology, Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Connaughton, V.; Bhat, P. N.; Briggs, M. S.; Preece, R. [CSPAR and Physics Department, University of Alabama in Huntsville, Huntsville, AL 35899 (United States); Jenke, P.; Kouveliotou, C.; Wilson-Hodge, C. A. [Space Science Office, VP62, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Van der Horst, A. J. [Astronomical Institute 'Anton Pannekoek', University of Amsterdam, NL-1090-GE Amsterdam (Netherlands); Camero-Arranz, A.; Finger, M.; Paciesas, W. S. [Universities Space Research Association, Huntsville, AL 35805 (United States); Beklen, E. [Physics Department, Suleyman Demirel University, 32260 Isparta (Turkey); Von Kienlin, A. [Max Planck Institute for Extraterrestrial Physics, Giessenbachstrasse, Postfach 1312, D-85748 Garching (Germany)
2012-12-01
Thermonuclear bursts from slowly accreting neutron stars (NSs) have proven difficult to detect, yet they are potential probes of the thermal properties of the NS interior. During the first year of a systematic all-sky search for X-ray bursts using the Gamma-ray Burst Monitor aboard the Fermi Gamma-ray Space Telescope we have detected 15 thermonuclear bursts from the NS low-mass X-ray binary 4U 0614+09 when it was accreting at nearly 1% of the Eddington limit. We measured an average burst recurrence time of 12 ± 3 days (68% confidence interval) between 2010 March and 2011 March, classified all bursts as normal duration bursts and placed a lower limit on the recurrence time of long/intermediate bursts of 62 days (95% confidence level). We discuss how observations of thermonuclear bursts in the hard X-ray band compare to pointed soft X-ray observations and quantify such bandpass effects on measurements of burst radiated energy and duration. We put our results for 4U 0614+09 in the context of other bursters and briefly discuss the constraints on ignition models. Interestingly, we find that the burst energies in 4U 0614+09 are on average between those of normal duration bursts and those measured in long/intermediate bursts. Such a continuous distribution in burst energy provides a new observational link between normal and long/intermediate bursts. We suggest that the apparent bimodal distribution that defined normal and long/intermediate duration bursts during the last decade could be due to an observational bias toward detecting only the longest and most energetic bursts from slowly accreting NSs.
The Gemini NICI planet-finding campaign: the orbit of the young exoplanet β Pictoris b
Nielsen, Eric L.; Liu, Michael C.; Chun, Mark; Ftaclas, Christ; Wahhaj, Zahed; Biller, Beth A.; Hayward, Thomas L.; Kuchner, Marc J.; Rodigas, Timothy J.; Toomey, Douglas W.
2014-10-20
We present new astrometry for the young (12-21 Myr) exoplanet β Pictoris b taken with the Gemini/NICI and Magellan/MagAO instruments between 2009 and 2012. The high dynamic range of our observations allows us to measure the relative position of β Pic b with respect to its primary star with greater accuracy than previous observations. Based on a Markov Chain Monte Carlo analysis, we find the planet has an orbital semi-major axis of 9.1 (+5.3/−0.5) AU and orbital eccentricity <0.15 at 68% confidence (with 95% confidence intervals of 8.2-48 AU and 0.00-0.82 for semi-major axis and eccentricity, respectively, due to a long narrow degenerate tail between the two). We find that the planet has reached its maximum projected elongation, enabling higher precision determination of the orbital parameters than previously possible, and that the planet's projected separation is currently decreasing. With unsaturated data of the entire β Pic system (primary star, planet, and disk) obtained thanks to NICI's semi-transparent focal plane mask, we are able to tightly constrain the relative orientation of the circumstellar components. We find the orbital plane of the planet lies between the inner and outer disks: the position angle (P.A.) of nodes for the planet's orbit (211.8° ± 0.3°) is 7.4° greater than the P.A. of the spine of the outer disk and 3.2° less than the warped inner disk P.A., indicating the disk is not collisionally relaxed. Finally, for the first time we are able to dynamically constrain the mass of the primary star β Pic to 1.76 (+0.18/−0.17) M☉.
none,
2013-08-15
Oak Ridge Associated Universities (ORAU), under the Oak Ridge Institute for Science and Education (ORISE) contract, collected split surface water samples with Nuclear Fuel Services (NFS) representatives on June 12, 2013. Representatives from the U.S. Nuclear Regulatory Commission (NRC) and the Tennessee Department of Environment and Conservation were also in attendance. Samples were collected at four surface water stations, as required in the approved Request for Technical Assistance number 11-018. These stations included Nolichucky River upstream (NRU), Nolichucky River downstream (NRD), Martin Creek upstream (MCU), and Martin Creek downstream (MCD). Both ORAU and NFS performed gross alpha and gross beta analyses, and Table 1 presents the comparison of results using the duplicate error ratio (DER), also known as the normalized absolute difference. A DER ≤ 3 indicates, at a 99% confidence level, that split sample results do not differ significantly when compared to their respective one standard deviation (sigma) uncertainty (ANSI N42.22). The NFS split sample report specifies a 95% confidence level for reported uncertainties (NFS 2013); therefore, standard two sigma reporting values were divided by 1.96. In conclusion, most DER values were less than 3 and results are consistent with low (e.g., background) concentrations. The gross beta result for sample 5198W0014 was the exception. The ORAU gross beta result of 6.30 ± 0.65 pCi/L from location NRD is well above NFS's non-detected result of 1.56 ± 0.59 pCi/L. NFS's data package includes no detected result for any radionuclide at location NRD. At NRC's request, ORAU performed gamma spectroscopic analysis of sample 5198W0014 to identify analytes contributing to the relatively elevated gross beta result. This analysis identified detected amounts of naturally-occurring constituents, most notably Ac-228 from the thorium decay series, and does not suggest the presence of site-related contamination.
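The DER itself is just the absolute difference of the split results over their combined one-sigma uncertainty. A sketch, assuming both labs' quoted ± values are 95% (1.96-sigma) uncertainties as the report describes for NFS; the ORAU convention is not stated, but either reading leaves this DER well above 3:

```python
import math

def duplicate_error_ratio(a, ua_95, b, ub_95):
    """DER (normalized absolute difference) from two results with 95% uncertainties."""
    ua, ub = ua_95 / 1.96, ub_95 / 1.96   # rescale to one sigma (assumption: both quoted at 95%)
    return abs(a - b) / math.hypot(ua, ub)

# Gross beta at NRD: ORAU 6.30 +/- 0.65 vs. NFS 1.56 +/- 0.59 pCi/L
der = duplicate_error_ratio(6.30, 0.65, 1.56, 0.59)  # well above the DER <= 3 criterion
```

A DER this far above 3 is exactly why the NRD gross beta pair is singled out as the exception in the comparison table.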
Measurement of CP observables in B± → D_CP K± decays and constraints on the CKM angle γ
The BABAR Collaboration; P. del Amo Sanchez
2011-02-15
Using the entire sample of 467 million Υ(4S) → B B̄ decays collected with the BaBar detector at the PEP-II asymmetric-energy B factory at SLAC, we perform a "GLW" analysis of B± → DK± decays, using decay modes in which the neutral D meson decays to either CP-eigenstates or non-CP-eigenstates. We measure the partial decay rate charge asymmetries for CP-even and CP-odd D final states to be A_CP+ = 0.25 ± 0.06 ± 0.02 and A_CP− = −0.09 ± 0.07 ± 0.02, respectively, where the first error is the statistical and the second is the systematic uncertainty. The parameter A_CP+ is different from zero with a significance of 3.6 standard deviations, constituting evidence for direct CP violation. We also measure the ratios of the charged-averaged B partial decay rates in CP and non-CP decays, R_CP+ = 1.18 ± 0.09 ± 0.05 and R_CP− = 1.07 ± 0.08 ± 0.04. We infer frequentist confidence intervals for the angle γ of the (db) unitarity triangle, for the strong phase difference δ_B, and for the amplitude ratio r_B, which are related to the B− → DK− decay amplitude by r_B e^{i(δ_B − γ)} = A(B− → D̄⁰K−)/A(B− → D⁰K−). Including statistical and systematic uncertainties, we obtain 0.24
Cuffey, R.J.; Pachut, J.F.
1990-12-01
The Holocene reef-building coral Favia pallida was sampled at 4.5 m depth increments (to 40 m) from two reefs on Enewetak Atoll to examine intraspecific environmental effects. An exposed outer reef was massive and wall-like, whereas a sheltered lagoonal reef grew as a slender pinnacle. Corallite diameter and growth rate, two attributes retrievable in fossil corals, were measured with data partitioned into shallow (<20 m), intermediate (20 to 29 m), and deep-water (>29 m) subsets. Highly significant differences between depth zone populations were found for both corallite diameters and growth rates in analyses of individual and combined reef data sets. Canonical variates analyses (CVA) separated populations from depth zones along single, highly significant, functions. Centroids and 95% confidence intervals, calculated from CVA scores of colonies in each population, are widely separated for the lagoon reef and combined data sets. Conversely, populations from shallow and intermediate depths on the outer reef display overlapping confidence bars indicative of more gradational morphologic changes. When CVs were used to classify specimens to groups, misassignments of intermediate depth specimens to shallow or deep-water populations underscored the gradational nature of the environment. Completely intergrading populations of Favia pallida collected from different depths can be morphologically separated into statistically distinct groupings. A stratigraphic succession of such morphotypes might be interpreted as abruptly appearing separate species if sampling were not as uniform, systematic, and detailed as was possible on modern reefs. Analyses of evolutionary patterns must carefully assess potential effects of clinal variation if past evolutionary patterns are to be interpreted correctly.
Measurement of CP violation observables and parameters for the decays $B^{\\pm}\\to DK^{*\\pm}$
Aubert, Bernard; Karyotakis, Y.; Lees, J.P.; Poireau, V.; Prencipe, E.; Prudent, X.; Tisserand, V.; /Annecy, LAPP; Garra Tico, J.; Grauges, E.; /Barcelona U., ECM; Martinelli, M.; Palano, A.; Pappagallo, M.; /INFN, Bari /Bari U.; Eigen, G.; Stugu, B.; Sun, L.; /Bergen U.; Battaglia, M.; Brown, D.N.; Kerth, L.T.; Kolomensky, Yu.G.; Lynch, G.; Osipenkov, I.L.; /UC, Berkeley /Birmingham U. /Ruhr U., Bochum /British Columbia U. /Brunel U. /Novosibirsk, IYF /UC, Irvine /UC, Riverside /UC, San Diego /UC, Santa Barbara /UC, Santa Cruz /Caltech /Cincinnati U. /Colorado U. /Colorado State U. /Dortmund U. /Dresden, Tech. U. /Ecole Polytechnique /Edinburgh U. /INFN, Ferrara /Ferrara U. /INFN, Ferrara /INFN, Ferrara /Ferrara U. /INFN, Ferrara /INFN, Ferrara /Ferrara U. /Frascati /INFN, Genoa /Genoa U. /INFN, Genoa /INFN, Genoa /Genoa U. /INFN, Genoa /INFN, Genoa /Genoa U. /Harvard U. /Heidelberg U. /Humboldt U., Berlin /Imperial Coll., London /Iowa State U. /Iowa State U. /Johns Hopkins U. /Orsay, LAL /LLNL, Livermore /Liverpool U. /Queen Mary, U. of London /Royal Holloway, U. of London /Louisville U. /Mainz U., Inst. Kernphys. /Manchester U. /Maryland U. /Massachusetts U., Amherst /MIT /McGill U. /INFN, Milan /Milan U. /INFN, Milan /INFN, Milan /Milan U. /Mississippi U. /Montreal U. /Mt. Holyoke Coll. /INFN, Naples /Naples U. /INFN, Naples /INFN, Naples /Naples U. /NIKHEF, Amsterdam /NIKHEF, Amsterdam /Notre Dame U. /Ohio State U. /Oregon U. /INFN, Padua /Padua U. /INFN, Padua /INFN, Padua /Padua U. /Paris U., VI-VII /Pennsylvania U. /INFN, Perugia /Perugia U. /INFN, Pisa /Pisa U. /INFN, Pisa /Pisa, Scuola Normale Superiore /INFN, Pisa /Pisa U. /INFN, Pisa /Princeton U. /INFN, Rome /INFN, Rome /Rome U. /INFN, Rome /INFN, Rome /Rome U. /INFN, Rome /INFN, Rome /Rome U. /INFN, Rome /INFN, Rome /Rome U. /INFN, Rome /Rostock U. /Rutherford /DAPNIA, Saclay /SLAC /South Carolina U. /Stanford U., Phys. Dept. /SUNY, Albany /Tel Aviv U. /Tennessee U. /Texas U. 
/Texas U., Dallas /INFN, Turin /Turin U. /INFN, Trieste /Trieste U. /Valencia U. /Victoria U. /Warwick U. /Wisconsin U., Madison
2010-08-26
We study the decay B⁻ → DK*⁻ using a sample of 379 × 10⁶ Υ(4S) → B B̄ events collected with the BABAR detector at the PEP-II B-factory. We perform a 'GLW' analysis where the D meson decays into either a CP-even (CP+) eigenstate (K⁺K⁻, π⁺π⁻), CP-odd (CP−) eigenstate (K⁰_S π⁰, K⁰_S φ, K⁰_S ω) or a non-CP state (K⁻π⁺). We also analyze D meson decays into K⁺π⁻ from a Cabibbo-favored D̄⁰ decay or doubly suppressed D⁰ decay ('ADS' analysis). We measure observables that are sensitive to the CKM angle γ: the partial-rate charge asymmetries A_CP±, the ratios R_CP± of the B-decay branching fractions in CP± and non-CP decay, the ratio R_ADS of the charge-averaged branching fractions, and the charge asymmetry A_ADS of the ADS decays: A_CP+ = 0.09 ± 0.13 ± 0.06, A_CP− = −0.23 ± 0.21 ± 0.07, R_CP+ = 2.17 ± 0.35 ± 0.09, R_CP− = 1.03 ± 0.27 ± 0.13, R_ADS = 0.066 ± 0.031 ± 0.010, and A_ADS = −0.34 ± 0.43 ± 0.16, where the first uncertainty is statistical and the second is systematic. Combining all the measurements and using a frequentist approach yields the magnitude of the ratio between the Cabibbo-suppressed and favored amplitudes, r_B = 0.31 with a one (two) sigma confidence level interval of [0.24, 0.38] ([0.17, 0.43]). The value r_B = 0 is excluded at the 3.3 sigma level. A similar analysis excludes values of γ in the intervals [0, 7]°, [55, 111]°, and [175, 180]° ([85, 99]°) at the one (two) sigma confidence level.
Treatment of Five or More Brain Metastases With Stereotactic Radiosurgery
Hunter, Grant K.; Suh, John H.; Reuther, Alwyn M.; Vogelbaum, Michael A.; Barnett, Gene H.; Angelov, Lilyana; Weil, Robert J.; Neyman, Gennady; Chao, Samuel T.
2012-08-01
Purpose: To examine the outcomes of patients with five or more brain metastases treated in a single session with stereotactic radiosurgery (SRS). Methods and Materials: Sixty-four patients with brain metastases treated with SRS to five or more lesions in a single session were reviewed. Primary disease type, number of lesions, Karnofsky performance score (KPS) at SRS, and status of primary and systemic disease at SRS were included. Patients were treated using dosing as defined by Radiation Therapy Oncology Group Protocol 90-05, with adjustments for critical structures. We defined prior whole-brain radiotherapy (WBRT) as WBRT completed >1 month before SRS and concurrent WBRT as WBRT completed within 1 month before or after SRS. Kaplan-Meier estimates and Cox proportional hazard regression were used to determine which patient and treatment factors predicted overall survival (OS). Results: The median OS after SRS was 7.5 months. The median KPS was 80 (range, 60-100). A KPS of ≥80 significantly influenced OS (median OS, 4.8 months for KPS ≤70 vs. 8.8 months for KPS ≥80, p = 0.0097). The number of lesions treated did not significantly influence OS (median OS, 6.6 months for eight or fewer lesions vs. 9.9 months for more than eight, p = nonsignificant). Primary site histology did not significantly influence median OS. On multivariate Cox modeling, KPS and prior WBRT significantly predicted for OS. Whole-brain radiotherapy before SRS compared with concurrent WBRT significantly influenced survival, with a risk ratio of 0.423 (95% confidence interval 0.191-0.936, p = 0.0338). No significant differences were observed when no WBRT was compared with concurrent WBRT or when the no WBRT group was compared with prior WBRT. A KPS of ≤70 predicted for poorer outcomes, with a risk ratio of 2.164 (95% confidence interval 1.157-4.049, p = 0.0157).
Conclusions: Stereotactic radiosurgery to five or more brain lesions is an effective treatment option for patients with metastatic cancer, especially for patients previously treated with WBRT. A KPS of ≥80 predicts for an improved outcome.
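The survival figures above rest on Kaplan-Meier estimation. As a reminder of the mechanics, a minimal product-limit estimator can be sketched in Python; the event times and censoring flags below are invented for illustration and are not study data:

```python
# Minimal Kaplan-Meier (product-limit) estimator.
# events[i] = 1 for an observed event (death), 0 for censoring.

def kaplan_meier(times, events):
    """Return (time, survival) pairs at each distinct event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = censored = 0
        # group all subjects tied at time t
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit update
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Invented follow-up times (months) and event indicators:
curve = kaplan_meier([2, 3, 3, 5, 7, 8, 8, 12], [1, 1, 0, 1, 0, 1, 1, 0])
```

Censored subjects leave the risk set without contributing a factor to the product, which is the whole trick of the estimator.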
Chen, Helen H.W. [Department of Radiation Oncology, National Cheng Kung University, Medical College and Hospital, Tainan, Taiwan (China); Institute of Clinical Medicine, College of Medicine, National Cheng Kung University, Tainan, Taiwan (China); Chou, Cheng-Yang [Department of Obstetrics and Gynecology, National Cheng Kung University, Medical College and Hospital, Tainan, Taiwan (China); Wu, Yuan-Hua; Hsueh, Wei-Ting; Hsu, Chiung-Hui [Department of Radiation Oncology, National Cheng Kung University, Medical College and Hospital, Tainan, Taiwan (China); Guo, How-Ran [Department of Environmental and Occupational Health, National Cheng Kung University, Medical College and Hospital, Tainan, Taiwan (China); Lee, Wen-Ying, E-mail: 7707@so-net.net.tw [Department of Pathology, Chi Mei Medical Center, Tainan, Taiwan (China) and Department of Pathology, College of Medicine, Taipei Medical University, Taipei, Taiwan (China); Su, Wu-Chou, E-mail: sunnysu@mail.ncku.edu.tw [Department of Internal Medicine, National Cheng Kung University, Medical College and Hospital, Tainan, Taiwan (China)
2012-02-01
Purpose: Constitutively activated signal transducers and activators of transcription (STAT) factors, in particular STAT1, STAT3, and STAT5, have been detected in a wide variety of human primary tumors and have been demonstrated to directly contribute to oncogenesis. However, the expression pattern of these STATs in cervical carcinoma is still unknown, as is whether or not they have prognostic significance. This study investigated the expression patterns of STAT1, STAT3, and STAT5 in cervical cancer and their associations with clinical outcomes in patients treated with radical radiation therapy. Methods and Materials: A total of 165 consecutive patients with International Federation of Gynecology and Obstetrics (FIGO) Stages IB to IVA cervical cancer underwent radical radiation therapy, including external beam and/or high-dose-rate brachytherapy between 1989 and 2002. Immunohistochemical studies of their formalin-fixed, paraffin-embedded tissues were performed. Univariate and multivariate analyses were performed to identify and to evaluate the effects of these factors affecting patient survival. Results: Constitutive activations of STAT1, STAT3, and STAT5 were observed in 11%, 22%, and 61% of the participants, respectively. While STAT5 activation was associated with significantly better metastasis-free survival (p < 0.01) and overall survival (p = 0.04), STAT1 and STAT3 activation were not. Multivariate analyses showed that STAT5 activation, bulky tumor (≥4 cm), advanced stage (FIGO Stages III and IV), and brachytherapy (yes vs. no) were independent prognostic factors for cause-specific overall survival. None of the STATs was associated with local relapse. STAT5 activation (odds ratio = 0.29, 95% confidence interval = 0.13-0.63) and advanced stage (odds ratio = 2.54; 95% confidence interval = 1.03-6.26) were independent predictors of distant metastasis.
Conclusions: This is the first report to provide the overall expression patterns and prognostic significance of specific STATs in cervical carcinoma. Our results indicate that constitutive STAT5 activation correlates with better metastasis-free survival and overall survival in cervical cancer patients who have received radiation therapy.
Huang, Shao Hui; O'Sullivan, Brian; Ringash, Jolie; Hope, Andrew; Gilbert, Ralph; Irish, Jonathan; Perez-Ordonez, Bayardo; Weinreb, Ilan; Waldron, John
2013-12-01
Purpose: To compare the temporal lymph node (LN) regression and regional control (RC) after primary chemoradiation therapy/radiation therapy in human papillomavirus-related [HPV(+)] versus human papillomavirus-unrelated [HPV(−)] head-and-neck cancer (HNC). Methods and Materials: All cases of N2-N3 HNC treated with radiation therapy/chemoradiation therapy between 2003 and 2009 were reviewed. Human papillomavirus status was ascertained by p16 staining on all available oropharyngeal cancers. Larynx/hypopharynx cancers were considered HPV(−). Initial radiologic complete nodal response (CR) (≤1.0 cm 8-12 weeks after treatment), ultimate LN resolution, and RC were compared between HPV(+) and HPV(−) HNC. Multivariate analysis identified outcome predictors. Results: A total of 257 HPV(+) and 236 HPV(−) HNCs were identified. The initial LN size was larger (mean, 2.9 cm vs 2.5 cm; P<.01), with a higher proportion of cystic LNs (38% vs 6%, P<.01), in HPV(+) versus HPV(−) HNC. CR was achieved in 125 HPV(+) HNCs (49%) and 129 HPV(−) HNCs (55%) (P=.18). The mean posttreatment largest LN was 36% of the original size in the HPV(+) group and 41% in the HPV(−) group (P<.01). The actuarial LN resolution was similar in the HPV(+) and HPV(−) groups at 12 weeks (42% and 43%, respectively), but it was higher in the HPV(+) group than in the HPV(−) group at 36 weeks (90% vs 77%, P<.01). The median follow-up period was 3.6 years. The 3-year RC rate was higher in the HPV(−) CR cases versus non-CR cases (92% vs 63%, P<.01) but was not different in the HPV(+) CR cases versus non-CR cases (98% vs 92%, P=.14). On multivariate analysis, HPV(+) status predicted ultimate LN resolution (odds ratio, 1.4 [95% confidence interval, 1.1-1.7]; P<.01) and RC (hazard ratio, 0.3 [95% confidence interval 0.2-0.6]; P<.01). Conclusions: HPV(+) LNs involute more quickly than HPV(−) LNs but undergo a more prolonged process to eventual CR beyond the time of initial assessment at 8 to 12 weeks after treatment.
Postradiation neck dissection is advisable for all non-CR HPV(−)/non-CR N3 HPV(+) cases, but it may be avoided for selected non-CR N2 HPV(+) cases with significant LN involution if they can undergo continued imaging surveillance. The role of positron emission tomography for response assessment should be investigated.
Meyer, Francois, E-mail: francois.meyer@chuq.qc.ca [Laval University Cancer Research Center, Centre hospitalier universitaire de Quebec - L'Hotel-Dieu de Quebec, Quebec (Canada); Fortin, Andre; Wang, Chang Shu [Radiation Therapy Department, Centre hospitalier universitaire de Quebec - L'Hotel-Dieu de Quebec, Quebec (Canada); Liu, Geoffrey [Applied Molecular Oncology, Ontario Cancer Institute/Princess Margaret Hospital, Toronto (Canada); Bairati, Isabelle [Laval University Cancer Research Center, Centre hospitalier universitaire de Quebec - L'Hotel-Dieu de Quebec, Quebec (Canada)
2012-03-15
Purpose: Radiation therapy (RT) causes acute and late toxicities that affect various organs and functions. In a large cohort of patients treated with RT for localized head and neck cancer (HNC), we prospectively assessed the occurrence of RT-induced acute and late toxicities and identified characteristics that predicted these toxicities. Methods and Materials: We conducted a randomized trial among 540 patients treated with RT for localized HNC to assess whether vitamin E supplementation could improve disease outcomes. Adverse effects of RT were assessed using the Radiation Therapy Oncology Group Acute Radiation Morbidity Criteria during RT and one month after RT, and the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer Late Radiation Morbidity Scoring Scheme at six and 12 months after RT. The most severe adverse effect among the organs/tissues was selected as an overall measure of either acute or late toxicity. Grade 3 and 4 toxicities were considered as severe. Stepwise multivariate logistic regression models were used to identify all independent predictors (p < 0.05) of acute or late toxicity and to estimate odds ratios (OR) for severe toxicity with their 95% confidence intervals (CI). Results: Grade 3 or 4 toxicity was observed in 23% and 4% of patients, respectively, for acute and late toxicity. Four independent predictors of severe acute toxicity were identified: sex (female vs. male: OR = 1.72, 95% confidence interval [CI]: 1.06-2.80), Karnofsky Performance Status (OR = 0.67 for a 10-point increment, 95% CI: 0.52-0.88), body mass index (above 25 vs. below: OR = 1.88, 95% CI: 1.22-2.90), TNM stage (Stage II vs. I: OR = 1.91, 95% CI: 1.25-2.92). Two independent predictors were found for severe late toxicity: female sex (OR = 3.96, 95% CI: 1.41-11.08) and weight loss during RT (OR = 1.26 for a 1 kg increment, 95% CI: 1.12-1.41). 
Conclusions: Knowledge of these predictors, easily collected in a clinical setting, could help tailor therapies to reduce toxicities among patients treated with RT for HNC.
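Odds ratios with 95% confidence intervals of the kind reported above are commonly computed from a 2×2 table with a Wald interval on the log odds ratio. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

# Hypothetical 2x2 table (exposure vs. severe toxicity); not study data.
a, b = 30, 70   # exposed: events, non-events
c, d = 15, 85   # unexposed: events, non-events

or_hat = (a * d) / (b * c)                     # sample odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR), Woolf's formula
lo = math.exp(math.log(or_hat) - 1.96 * se)    # 95% Wald CI, back-transformed
hi = math.exp(math.log(or_hat) + 1.96 * se)
```

The interval is symmetric on the log scale and asymmetric after exponentiation, which is why published ORs have lopsided CIs.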
Nakajima, Naomi; Kataoka, Masaaki; Sugawara, Yoshifumi; Ochi, Takashi; Kiyoto, Sachiko; Ohsumi, Shozo; Mochizuki, Teruhito
2013-11-15
Purpose: To determine whether volume-based parameters on pretreatment ¹⁸F-fluorodeoxyglucose positron emission tomography/computed tomography in breast cancer patients treated with mastectomy without adjuvant radiation therapy are predictive of recurrence. Methods and Materials: We retrospectively analyzed 93 patients with 1 to 3 positive axillary nodes after surgery, who were studied with ¹⁸F-fluorodeoxyglucose positron emission tomography/computed tomography for initial staging. We evaluated the relationship between positron emission tomography parameters, including the maximum standardized uptake value, metabolic tumor volume (MTV), and total lesion glycolysis (TLG), and clinical outcomes. Results: The median follow-up duration was 45 months. Recurrence was observed in 11 patients. Metabolic tumor volume and TLG were significantly related to tumor size, number of involved nodes, nodal ratio, nuclear grade, estrogen receptor (ER) status, and triple negativity (TN) (all P values were <.05). In receiver operating characteristic curve analysis, MTV and TLG showed better predictive performance than tumor size, ER status, or TN (area under the curve: 0.85, 0.86, 0.79, 0.74, and 0.74, respectively). On multivariate analysis, MTV was an independent prognostic factor of locoregional recurrence-free survival (hazard ratio 34.42, 95% confidence interval 3.94-882.71, P=.0008) and disease-free survival (DFS) (hazard ratio 13.92, 95% confidence interval 2.65-103.78, P=.0018). The 3-year DFS rate was 93.8% for the lower MTV group (<53.1; n=85) and 25.0% for the higher MTV group (≥53.1; n=8; P<.0001, log-rank test). The 3-year DFS rate for patients with both ER-positive status and MTV <53.1 was 98.2%; and for those with ER-negative status and MTV ≥53.1 it was 25.0% (P<.0001). Conclusions: Volume-based parameters improve recurrence prediction in postmastectomy breast cancer patients with 1 to 3 positive nodes.
The addition of MTV to ER status or TN has potential benefits to identify a subgroup at higher risk for recurrence.
Curran, Tim
The degree of commonality between the perceptual mechanisms ... to faces when concurrently processing visual objects of expertise. In car experts ... when the car and face stimuli were ... and the subject's level of car expertise as measured in an independent behavioral task. Together, these results show ...
Breast Cancer After Treatment of Hodgkin's Lymphoma: Risk Factors That Really Matter
Alm El-Din, Mohamed A. [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Hughes, Kevin S. [Department of Surgical Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Finkelstein, Dianne M.; Betts, Keith A. [Department of Biostatistics, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Yock, Torunn I.; Tarbell, Nancy J. [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Aisenberg, Alan C. [Department of Medical Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States); Taghian, Alphonse G. [Department of Radiation Oncology, Massachusetts General Hospital, Harvard Medical School, Boston, MA (United States)], E-mail: ataghian@partners.org
2009-01-01
Purpose: To evaluate the risk of breast cancer (BC) and the contributing risk factors in women after supradiaphragmatic irradiation (SDI) for Hodgkin's lymphoma (HL). Subjects and Methods: Medical records of 248 women 60 years of age or less who received SDI for stage I/II HL between 1964 and 2001 at Massachusetts General Hospital were retrospectively reviewed. Results: The median age at SDI was 26 years (range, 5.7-59.3). The median follow-up was 15.2 years (range, 0.1-41.3). In 36 patients, BC developed (bilaterally in 11 patients) at a median interval of 18.4 years (range, 4.3-33.8) after SDI. Based on data from the National Cancer Institute Surveillance, Epidemiology, and End Results program, the standardized morbidity ratio (SMR) for the first BC after SDI was 9.78 (95% confidence interval [CI], 4.64-18.11, p < 0.0001). The SMR of patients who received radiation before age of 30 years was 19.05 (95% CI, 12.33-28.13) compared with 4.64 (95% CI, 2.31-8.30) for patients aged 30 years or more at the time of treatment (p < 0.00003). Risk for BC was significantly higher 15 years or more after SDI compared with the risk during the first 15 years (p = 0.0026). None of HL characteristics or treatment details was associated with higher risk of BC after adjusting for age and calendar time. Conclusions: Age at irradiation and time since therapy appear to be the only significant risk factors for development of BC after treatment of HL. The risk is significantly higher 15 years or more after radiation and for women treated before age 30 years. Long-term surveillance strategies are indicated for women at risk.
Measurement of the CP-violating phase βs^{J/ψφ} in B0s → J/ψφ decays with the CDF II detector
Aaltonen, T.; Álvarez González, B.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J. A.; Arisawa, T.; et al
2012-04-23
We present a measurement of the CP-violating parameter βs^{J/ψφ} using approximately 6500 B0s → J/ψφ decays reconstructed with the CDF II detector in a sample of pp̄ collisions at √s = 1.96 TeV corresponding to 5.2 fb⁻¹ integrated luminosity produced by the Tevatron collider at Fermilab. We find the CP-violating phase to be within the range βs^{J/ψφ} ∈ [0.02, 0.52] ∪ [1.08, 1.55] at 68% confidence level, where the coverage property of the quoted interval is guaranteed using a frequentist statistical analysis. This result is in agreement with the standard model expectation at the level of about one Gaussian standard deviation. We consider the inclusion of a potential S-wave contribution to the B0s → J/ψK⁺K⁻ final state, which is found to be negligible over the mass interval 1.009 < m(K⁺K⁻) < 1.028 GeV/c². Assuming the standard model prediction for βs^{J/ψφ}, we find the B0s decay width difference to be ΔΓs = 0.075 ± 0.035 (stat) ± 0.006 (syst) ps⁻¹. We also present the most precise measurements of the B0s mean lifetime τ(B0s) = 1.529 ± 0.025 (stat) ± 0.012 (syst) ps, the polarization fractions |A0(0)|² = 0.524 ± 0.013 (stat) ± 0.015 (syst) and |A∥(0)|² = 0.231 ± 0.014 (stat) ± 0.015 (syst), as well as the strong phase δ⊥ = 2.95 ± 0.64 (stat) ± 0.07 (syst) rad. In addition, we report an alternative Bayesian analysis that gives results consistent with the frequentist approach.
John Veitch; Vivien Raymond; Benjamin Farr; Will M. Farr; Philip Graff; Salvatore Vitale; Ben Aylott; Kent Blackburn; Nelson Christensen; Michael Coughlin; Walter Del Pozzo; Farhan Feroz; Jonathan Gair; Carl-Johan Haster; Vicky Kalogera; Tyson Littenberg; Ilya Mandel; Richard O'Shaughnessy; Matthew Pitkin; Carl Rodriguez; Christian Röver; Trevor Sidery; Rory Smith; Marc Van Der Sluys; Alberto Vecchio; Will Vousden; Leslie Wade
2015-02-16
The Advanced LIGO and Advanced Virgo gravitational wave (GW) detectors will begin operation in the coming years, with compact binary coalescence events a likely source for the first detections. The gravitational waveforms emitted directly encode information about the sources, including the masses and spins of the compact objects. Recovering the physical parameters of the sources from the GW observations is a key analysis task. This work describes the LALInference software library for Bayesian parameter estimation of compact binary signals, which builds on several previous methods to provide a well-tested toolkit which has already been used for several studies. We show that our implementation is able to correctly recover the parameters of compact binary signals from simulated data from the advanced GW detectors. We demonstrate this with a detailed comparison on three compact binary systems: a binary neutron star, a neutron star black hole binary and a binary black hole, where we show a cross-comparison of results obtained using three independent sampling algorithms. These systems were analysed with non-spinning, aligned spin and generic spin configurations respectively, showing that consistent results can be obtained even with the full 15-dimensional parameter space of the generic spin configurations. We also demonstrate statistically that the Bayesian credible intervals we recover correspond to frequentist confidence intervals under correct prior assumptions by analysing a set of 100 signals drawn from the prior. We discuss the computational cost of these algorithms, and describe the general and problem-specific sampling techniques we have used to improve the efficiency of sampling the compact binary coalescence parameter space.
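The coverage check described at the end of the abstract (Bayesian credible intervals matching frequentist coverage when the true signals are drawn from the prior) can be illustrated with a toy conjugate model rather than LALInference itself. Everything below (the Gaussian model, sample sizes, trial count) is an assumption for illustration only:

```python
import random

# Coverage check in the spirit of a P-P test: if the true parameter is drawn
# from the prior, an X% credible interval should contain it X% of the time.
# Toy model: theta ~ N(0,1) prior, data y_i ~ N(theta,1) for i=1..n, so the
# posterior is N(n*ybar/(n+1), 1/(n+1)) by standard conjugate updating.

random.seed(0)
n, trials, z90 = 10, 2000, 1.6449   # z-quantile for a central 90% interval
hits = 0
for _ in range(trials):
    theta = random.gauss(0, 1)                              # truth from prior
    ybar = sum(random.gauss(theta, 1) for _ in range(n)) / n
    mean, sd = n * ybar / (n + 1), (1 / (n + 1)) ** 0.5     # posterior
    if mean - z90 * sd <= theta <= mean + z90 * sd:
        hits += 1
coverage = hits / trials   # should be close to 0.90
```

With the model correctly specified, the empirical coverage fluctuates around the nominal 90% within binomial noise, which is exactly the property the paper verifies on 100 simulated signals.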
STUDIES OF WALL FLAME QUENCHING AND HYDROCARBON EMISSIONS IN A MODEL SPARK IGNITION ENGINE
Ishikawa, Nobuhiko
2011-01-01
Figure captions: ignition timing at 10 msec BTC, time interval 5 msec; flat ignition, ignition timing at 12 BTC, time interval 5 msec; ignition timing at 25 msec BTC, time interval 5 msec.
Abbott, B; Adhikari, R; Agresti, J; Ajith, P; Allen, B; Amin, R; Anderson, S B; Anderson, W G; Arain, M; Araya, M; Armandula, H; Ashley, M; Aston, S; Aufmuth, P; Aulbert, C; Babak, S; Ballmer, S; Bantilan, H; Barish, B C; Barker, C; Barker, D; Barr, B; Barriga, P; Barton, M A; Bayer, K; Belczynski, K; Betzwieser, J; Beyersdorf, P T; Bhawal, B; Bilenko, I A; Billingsley, G; Biswas, R; Black, E; Blackburn, K; Blackburn, L; Blair, D; Bland, B; Bogenstahl, J; Bogue, L; Bork, R; Boschi, V; Bose, S; Brady, P R; Braginsky, V B; Brau, J E; Brinkmann, M; Brooks, A; Brown, D A; Bullington, A; Bunkowski, A; Buonanno, A; Burmeister, O; Busby, D; Byer, R L; Cadonati, L; Cagnoli, G; Camp, J B; Cannizzo, J; Cannon, K; Cantley, C A; Cao, J; Cardenas, L; Casey, M M; Castaldi, G; Cepeda, C; Chalkey, E; Charlton, P; Chatterji, S; Chelkowski, S; Chen, Y; Chiadini, F; Chin, D; Chin, E; Chow, J; Christensen, N; Clark, J; Cochrane, P; Cokelaer, T; Colacino, C N; Coldwell, R; Conte, R; Cook, D; Corbitt, T; Coward, D; Coyne, D; Creighton, J D E; Creighton, T D; Croce, R P; Crooks, D R M; Cruise, A M; Cumming, A; Dalrymple, J; D'Ambrosio, E; Danzmann, K; Davies, G; De Bra, D; Degallaix, J; Degree, M; Demma, T; Dergachev, V; Desai, S; DeSalvo, R; Dhurandhar, S V; Díaz, M; Dickson, J; Di Credico, A; Diederichs, G; Dietz, A; Doomes, E E; Drever, R W P; Dumas, J C; Dupuis, R J; Dwyer, J G; Ehrens, P; Espinoza, E; Etzel, T; Evans, M; Evans, T; Fairhurst, S; Fan, Y; Fazi, D; Fejer, M M; Finn, L S; Fiumara, V; Fotopoulos, N; Franzen, A; Franzen, K Y; Freise, A; Frey, R; Fricke, T; Fritschel, P; Frolov, V V; Fyffe, M; Galdi, V; Garofoli, J; Gholami, I; Giaime, J A; Giampanis, S; Giardina, K D; Goda, K; Goetz, E; Goggin, L; González, G; Gossler, S; Grant, A; Gras, S; Gray, C; Gray, M; Greenhalgh, J; Gretarsson, A M; Grosso, R; Grote, H; Grünewald, S; Günther, M; Gustafson, R; Hage, B; Hammer, D; Hanna, C; Hanson, J; Harms, J; Harry, G; Harstad, E; Hayler, T; Heefner, J; Heng, I S; Heptonstall, A; 
Heurs, M; Hewitson, M; Hild, S; Hirose, E; Hoak, D; Hosken, D; Hough, J; Howell, E; Hoyland, D; Huttner, S H; Ingram, D; Innerhofer, E; Ito, M; Itoh, Y; Ivanov, A; Jackrel, D; Johnson, B; Johnson, W W; Jones, D I; Jones, G; Jones, R; Ju, L; Kalmus, Peter Ignaz Paul; Kalogera, V; Kamat, S; Kasprzyk, D; Katsavounidis, E; Kawabe, K; Kawamura, S; Kawazoe, F; Kells, W; Keppel, D G; Khalili, F Ya; Kim, C; King, P; Kissel, J S; Klimenko, S; Kokeyama, K; Kondrashov, V; Kopparapu, R K; Kozak, D; Krishnan, B; Kwee, P; Lam, P K; Landry, M; Lantz, B; Lazzarini, A; Lee, B; Lei, M; Leiner, J; Leonhardt, V; Leonor, I; Libbrecht, K; Lindquist, P; Lockerbie, N A; Longo, M; Lormand, M; Lubinski, M; Luck, H; Machenschalk, B; MacInnis, M; Mageswaran, M; Mailand, K; Malec, M; Mandic, V; Marano, S; Marka, S; Markowitz, J; Maros, E; Martin, I; Marx, J N; Mason, K; Matone, L; Matta, V; Mavalvala, N; McCarthy, R; McClelland, D E; McGuire, S C; McHugh, M; McKenzie, K; McNabb, J W C; McWilliams, S; Meier, T; Melissinos, A C; Mendell, G; Mercer, R A; Meshkov, S; Messaritaki, E; Messenger, C J; Meyers, D; Mikhailov, E; Mitra, S; Mitrofanov, V P; Mitselmakher, G; Mittleman, R; Miyakawa, O; Mohanty, S; Moreno, G; Mossavi, K; Mow Lowry, C; Moylan, A; Mudge, D; Müller, G; Mukherjee, S; Muller-Ebhardt, H; Munch, J; Murray, P; Myers, E; Myers, J; Newton, G; Nishizawa, A; Numata, K; O'Reilly, B; O'Shaughnessy, R; Ottaway, D J; Overmier, H; Owen, B J; Pan, Y; Papa, M A; Parameshwaraiah, V; Patel, P; Pedraza, M; Penn, S; Pierro, V; Pinto, I M; Pitkin, M; Pletsch, H; Plissi, M V; Postiglione, F; Prix, R; Quetschke, V; Raab, F; Rabeling, D; Radkins, H; Rahkola, R; Rainer, N; Rakhmanov, M; Ray-Majumder, S; Re, V; Rehbein, H; Reid, S; Reitze, D H; Ribichini, L; Riesen, R; Riles, K; Rivera, B; Robertson, N A; Robinson, C; Robinson, E L; Roddy, S; Rodríguez, A; Rogan, A M; Rollins, J; Romano, J D; Romie, J; Route, R; Rowan, S; Rüdiger, A; Ruet, L; Russell, P; Ryan, K; Sakata, S; Samidi, M; Sancho de la 
Jordana, L; Sandberg, V; Sannibale, V; Saraf, S; Sarin, P; Sathyaprakash, B S; Sato, S; Saulson, P R; Savage, R; Savov, P; Schediwy, S; Schilling, R; Schnabel, R; Schofield, R; Schutz, B F; Schwinberg, P; Scott, S M; Searle, A C; Sears, B; Seifert, F; Sellers, D; Sengupta, A S; Shawhan, P; Shoemaker, D H; Sibley, A; Sidles, J A; Siemens, X; Sigg, D; Sinha, S; Sintes, A M; Slagmolen, B; Slutsky, J; Smith, J R; Smith, M R; Somiya, K; Strain, K A; Strom, D M; Stuver, A; Summerscales, T Z; Sun, K X; Sung, M; Sutton, P J; Takahashi, H; Tanner, D B; Tarallo, M; Taylor, R; Thacker, J; Thorne, K A; Thorne, K S; Thüring, A; Tokmakov, K V; Torres, C; Torrie, C; Traylor, G; Trias, M; Tyler, W; Ugolini, D W; Ungarelli, C; Urbanek, K; Vahlbruch, H; Vallisneri, M; Van Den Broeck, C; Varvella, M; Vass, S; Vecchio, A; Veitch, J; Veitch, P; Villar, A; Vorvick, C; Vyachanin, S P; Waldman, S J
2007-01-01
We have searched for Gravitational Waves (GWs) associated with the SGR 1806-20 hyperflare of 27 December 2004. This event, originating from a Galactic neutron star, displayed exceptional energetics. Recent investigations of the X-ray light curve's pulsating tail revealed the presence of Quasi-Periodic Oscillations (QPOs) in the 30 - 2000 Hz frequency range, most of which coincides with the bandwidth of the LIGO detectors. These QPOs, with well-characterized frequencies, can plausibly be attributed to seismic modes of the neutron star which could emit GWs. Our search targeted potential quasi-monochromatic GWs lasting for tens of seconds and emitted at the QPO frequencies. We have observed no candidate signals above a pre-determined threshold and our lowest upper limit was set by the 92.5 Hz QPO observed in the interval from 150 s to 260 s after the start of the flare. This bound corresponds to a (90% confidence) root-sum-squared amplitude h_rssdet^90% = 4.5e-22 strain Hz^-1/2 on the GW waveform strength in the...
Materials corrosion of high temperature alloys immersed in 600C binary nitrate salt.
Kruizenga, Alan Michael; Gill, David Dennis; LaFord, Marianne Elizabeth
2013-03-01
Thirteen high temperature alloys were immersion tested in a 60/40 binary nitrate salt. Samples were interval tested up to 3000 hours at 600 °C with air as the ullage gas. Chemical analysis of the molten salt indicated lower nitrite concentrations present in the salt, as predicted by the equilibrium equation. Corrosion rates were generally low for all alloys. Corrosion products were identified using x-ray diffraction and electron microprobe analysis. Fe-Cr based alloys tended to form mixtures of sodium and iron oxides, while Fe-Ni/Cr alloys had similar corrosion products plus oxides of nickel and chromium. Nickel based alloys primarily formed NiO, with chromium oxides near the oxide/base alloy interface. In625 exhibited similar corrosion performance in relation to previous tests, lending confidence to comparisons between past and present experiments. HA230 exhibited internal oxidation that consisted of a nickel/chromium oxide. Alloys with significant aluminum alloying tended to exhibit superior performance, due to the formation of a thin alumina layer. Soluble corrosion products of chromium, molybdenum, and tungsten were also formed and are thought to be a significant factor in alloy performance.
Using Weibull Distribution Analysis to Evaluate ALARA Performance
E. L. Frome, J. P. Watkins, and D. A. Hagemeyer
2009-10-01
As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
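The Weibull-based indicators described above (maximum likelihood estimates of shape and scale, the 99th percentile, the exceedance fraction) can be sketched as follows. The dose values are invented, and the bisection solve of the shape equation is one simple approach, not necessarily the authors' implementation:

```python
import math

# Weibull MLE sketch: estimate shape k and scale lam from positive dose
# values, then derive the 99th percentile and the exceedance fraction
# (probability of a dose above a limit). Doses below are illustrative.

def weibull_mle(x):
    logs = [math.log(v) for v in x]
    mlog = sum(logs) / len(x)

    def g(k):
        # Profile score equation for the shape parameter; its root is the MLE.
        xk = [v ** k for v in x]
        return sum(a * b for a, b in zip(xk, logs)) / sum(xk) - 1 / k - mlog

    lo, hi = 0.01, 100.0          # g is increasing in k on this range
    for _ in range(200):          # bisection to machine precision
        mid = (lo + hi) / 2
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = (lo + hi) / 2
    lam = (sum(v ** k for v in x) / len(x)) ** (1 / k)   # closed-form scale
    return k, lam

doses = [0.05, 0.12, 0.2, 0.31, 0.45, 0.6, 0.8, 1.1, 1.6, 2.4]  # mSv, made up
k, lam = weibull_mle(doses)
p99 = lam * (-math.log(1 - 0.99)) ** (1 / k)   # 99th percentile of the fit
exceed = math.exp(-(2.0 / lam) ** k)           # fraction exceeding 2.0 mSv
```

On a Weibull probability plot, k is the slope of the fitted line, matching the paper's interpretation of the shape parameter as a performance indicator.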
Measures of agreement between computation and experiment:validation metrics.
Barone, Matthew Franklin; Oberkampf, William Louis
2005-08-01
With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
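A confidence-interval-based validation metric of the flavor described above can be sketched for a single input setting: the estimated model error is the experimental mean minus the computational prediction, with a t-based interval reflecting measurement uncertainty. The measurements, prediction, and t quantile below are illustrative assumptions, not the paper's data:

```python
import math

# One-point validation metric sketch: estimated model error with a 95%
# t-based confidence interval from repeated experimental measurements.

measurements = [101.2, 98.7, 100.4, 99.1, 100.9]   # repeated experiments (invented)
y_model = 97.5                                      # computational prediction (invented)

n = len(measurements)
ybar = sum(measurements) / n
s = math.sqrt(sum((y - ybar) ** 2 for y in measurements) / (n - 1))  # sample SD
t975 = 2.776                       # t quantile, 97.5%, n-1 = 4 degrees of freedom

error = ybar - y_model             # estimated model error
half_width = t975 * s / math.sqrt(n)
ci = (error - half_width, error + half_width)      # 95% CI on the true error
```

If the interval excludes zero, the disagreement between model and experiment is larger than the experimental uncertainty can explain, which is the interpretability the metric is designed for.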
Search for Stopped Gluinos in pp collisions at sqrt s = 7 TeV
Khachatryan, Vardan; et al.
2011-01-01
The results of the first search for long-lived gluinos produced in 7 TeV pp collisions at the CERN Large Hadron Collider are presented. The search looks for evidence of long-lived particles that stop in the CMS detector and decay in the quiescent periods between beam crossings. In a dataset with a peak instantaneous luminosity of 10^{32} cm^{-2} s^{-1}, an integrated luminosity of 10 inverse picobarns, and a search interval corresponding to 62 hours of LHC operation, no significant excess above background was observed. Limits at the 95% confidence level on gluino pair production over 13 orders of magnitude of gluino lifetime are set. For a mass difference between the gluino and the neutralino greater than 100 GeV/c^2, and assuming a branching ratio for gluino to gluon+neutralino of 100%, gluinos of mass less than 370 GeV/c^2 are excluded for lifetimes from 10 microseconds to 1000 s.
A Direct Top-Quark Width Measurement from Lepton + Jets Events at CDF II
Aaltonen, T.; Alvarez Gonzalez, B.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J.A.; Apresyan, A.; Arisawa, T.; /Waseda U. /Dubna, JINR
2010-08-01
We present a measurement of the top-quark width using tt̄ events produced in pp̄ collisions at Fermilab's Tevatron collider and collected by the CDF II detector. In the mode where the top quark decays to a W boson and a bottom quark, we select events in which one W decays leptonically and the other hadronically (lepton + jets channel). From a data sample corresponding to 4.3 fb⁻¹ of integrated luminosity, we identify 756 candidate events. The top-quark mass and the mass of the W boson that decays hadronically are reconstructed for each event and compared with templates of different top-quark widths (Γt) and deviations from the nominal jet energy scale (ΔJES) to perform a simultaneous fit for both parameters, where ΔJES is used for the in situ calibration of the jet energy scale. By applying a Feldman-Cousins approach, we establish an upper limit at 95% confidence level (CL) of Γt < 7.6 GeV and a two-sided 68% CL interval of 0.3 GeV < Γt < 4.4 GeV for a top-quark mass of 172.5 GeV/c², consistent with the standard model prediction. This is the first direct measurement of Γt to set a lower limit with 68% CL.
Thermal Versus Impedance-Based Ablation of Renal Cell Carcinoma: A Meta-analysis
Modabber, Milad, E-mail: mmodabber@gmail.com; Martin, Jason, E-mail: jason.martin@medportal.ca [McMaster University, DeGroote School of Medicine (Canada); Athreya, Sriharsha, E-mail: sathreya@stjosham.on.ca [McMaster University, Faculty of Health Sciences (Canada)
2013-10-04
Background: Percutaneous radiofrequency ablation (RFA) of renal cell carcinoma has become an established treatment modality. However, thermal-based (TB) and impedance-based (IB) RF generators have not previously been compared. Methods: A literature search on the application of RFA for renal masses using TB or IB RF generators was performed. The safety, efficacy, and long-term outcomes of TB versus IB RFA were assessed using the outcome measures of technical success, local recurrence rate, complications, and preservation of renal function. Results: Across the 27 included studies, pooled results suggested comparable technical success (TB-RFA 98.53 % vs. IB-RFA 98.78 %, P = 0.9813). Clinical efficacy was also similar for both generator types (91.0 % TB-RFA vs. 91.5 % IB-RFA; P = 0.73). At follow-up, no differences in renal function (relative risk [RR] 0.5, 95 % confidence interval [CI] 0.45–5.48) or local recurrence (RR 0.717, 95 % CI 0.49–1.50) were observed. The pooled overall complication rate was 13.1 % for TB-RFA and 11.5 % for IB-RFA. Conclusion: No differences in the observed parameters were found either during surgery or at follow-up.
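Relative risks with 95% confidence intervals of the kind quoted above are conventionally computed on the log scale (the Katz log-interval). A minimal sketch with made-up counts, not the meta-analysis data:

```python
import math

def relative_risk_ci(events_1, n_1, events_2, n_2, z=1.96):
    """Relative risk of group 1 vs. group 2 with a Katz log-scale CI.

    The standard error of log(RR) is sqrt(1/a - 1/n1 + 1/c - 1/n2),
    and the interval is exponentiated back to the risk-ratio scale.
    z = 1.96 gives a 95% confidence interval.
    """
    rr = (events_1 / n_1) / (events_2 / n_2)
    se = math.sqrt(1 / events_1 - 1 / n_1 + 1 / events_2 - 1 / n_2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 10/100 events in each arm
rr, lo, hi = relative_risk_ci(10, 100, 10, 100)
```

An interval that straddles 1.0 (as both RR intervals in the abstract do) is consistent with no difference between the two groups, which is how the meta-analysis reads its renal-function and recurrence results.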
Grimm, Lars J., E-mail: Lars.grimm@duke.edu; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie [Department of Radiology, Duke University Medical Center, Box 3808, Durham, North Carolina 27710 (United States)]; Kuzmiak, Cherie M. [Department of Radiology, University of North Carolina School of Medicine, 2006 Old Clinic, CB No. 7510, Chapel Hill, North Carolina 27599 (United States)]; Mazurowski, Maciej A. [Duke University Medical Center, Box 2731 Medical Center, Durham, North Carolina 27710 (United States)]
2014-03-15
Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral mediolateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739, 95% Confidence Interval: 0.543–0.680, p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.
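The AUC values reported above have a useful probabilistic reading: the AUC equals the Mann-Whitney probability that a randomly chosen positive case (here, a case the trainee gets wrong) receives a higher model score than a randomly chosen negative one, with ties counted as one half. A minimal sketch with illustrative scores, not the study's data:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney statistic: P(pos score > neg score),
    counting ties as 0.5, normalized by the number of pairs."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated scores give AUC = 1.0; identical scores give 0.5
perfect = auc([0.9, 0.8], [0.1, 0.2])
chance = auc([0.5], [0.5])
```

Under this reading, the study's mean AUC of 0.611 says the per-trainee models rank error-prone cases above error-free ones about 61% of the time, modestly but significantly better than the chance value of 0.5.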