Classification with asymmetric label noise: Consistency and maximal denoising
Abstract
In many real-world classification problems, the labels of training examples are randomly corrupted. Most previous theoretical work on classification with label noise assumes that the two classes are separable, that the label noise is independent of the true class label, or that the noise proportions for each class are known. In this work, we give conditions that are necessary and sufficient for the true class-conditional distributions to be identifiable. These conditions are weaker than those analyzed previously, and allow for the classes to be nonseparable and the noise levels to be asymmetric and unknown. The conditions essentially state that a majority of the observed labels are correct and that the true class-conditional distributions are “mutually irreducible,” a concept we introduce that limits the similarity of the two distributions. For any label noise problem, there is a unique pair of true class-conditional distributions satisfying the proposed conditions, and we argue that this pair corresponds in a certain sense to maximal denoising of the observed distributions. Our results are facilitated by a connection to “mixture proportion estimation,” which is the problem of estimating the maximal proportion of one distribution that is present in another. We establish a novel rate of convergence result for mixture proportion estimation, and apply this to obtain consistency of a discrimination rule based on surrogate loss minimization. Experimental results on benchmark data and a nuclear particle classification problem demonstrate the efficacy of our approach.
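The mixture proportion estimation problem at the heart of the abstract can be illustrated with a small simulation. The Gaussian class-conditionals, the noise rates, and the naive histogram plug-in estimator `mpe` below are all illustrative assumptions for this sketch; the paper's own estimator and convergence analysis are different and more refined.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's experiments): true class-conditionals
# P0 = N(-1, 1) and P1 = N(+1, 1), which overlap (nonseparable classes).
n = 200_000
x0 = rng.normal(-1.0, 1.0, n)  # draws from true class 0
x1 = rng.normal(+1.0, 1.0, n)  # draws from true class 1

# Asymmetric label noise: each observed class-conditional is a mixture
#   P~0 = (1 - pi0) P0 + pi0 P1,   P~1 = (1 - pi1) P1 + pi1 P0,
# with unknown (to the learner) noise rates pi0 != pi1.
pi0, pi1 = 0.2, 0.3
xt0 = np.where(rng.random(n) < pi0, rng.normal(+1.0, 1.0, n), x0)
xt1 = np.where(rng.random(n) < pi1, rng.normal(-1.0, 1.0, n), x1)

def mpe(f_sample, h_sample, bins=35, lo=-3.5, hi=3.5):
    """Naive histogram plug-in for the maximal mixture proportion
    nu(F, H) = inf_S F(S) / H(S), i.e. the largest fraction of H that
    can be 'subtracted' from F while leaving a valid distribution."""
    edges = np.linspace(lo, hi, bins + 1)
    f_counts, _ = np.histogram(f_sample, edges)
    h_counts, _ = np.histogram(h_sample, edges)
    keep = h_counts > 0
    ratios = (f_counts[keep] / len(f_sample)) / (h_counts[keep] / len(h_sample))
    return float(ratios.min())

# For this mixture, nu(P~0, P~1) = pi0 / (1 - pi1) ~= 0.286; the plug-in
# estimate should land in that neighborhood (it is biased low by the min).
nu_hat = mpe(xt0, xt1)
```

Mutual irreducibility of P0 and P1 is what makes the noise rates recoverable from such estimates: if neither true distribution contains a component of the other, the observed mixtures determine the pair (pi0, pi1) uniquely.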
- Authors:
- Blanchard, Gilles; Flaska, Marek; Handy, Gregory; Pozzi, Sara; Scott, Clayton
- Univ. of Potsdam (Germany). Inst. for Mathematics
- Pennsylvania State Univ., University Park, PA (United States). Dept. of Mechanical and Nuclear Engineering
- Univ. of Utah, Salt Lake City, UT (United States). Dept. of Mathematics
- Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Nuclear Engineering and Radiological Sciences
- Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Electrical and Computer Engineering, Statistics
- Publication Date:
- 2016-09-20
- Research Org.:
- Univ. of Michigan, Ann Arbor, MI (United States)
- Sponsoring Org.:
- USDOE National Nuclear Security Administration (NNSA), Office of Nonproliferation and Verification Research and Development (NA-22)
- OSTI Identifier:
- 1366694
- Grant/Contract Number:
- NA0002534
- Resource Type:
- Journal Article: Accepted Manuscript
- Journal Name:
- Electronic Journal of Statistics
- Additional Journal Information:
- Journal Volume: 10; Journal Issue: 2; Journal ID: ISSN 1935-7524
- Publisher:
- Institute of Mathematical Statistics (IMS) and the Bernoulli Society
- Country of Publication:
- United States
- Language:
- English
- Subject:
- 73 NUCLEAR PHYSICS AND RADIATION PHYSICS; Classification, label noise, mixture proportion estimation, surrogate loss, consistency
Citation Formats
Blanchard, Gilles, Flaska, Marek, Handy, Gregory, Pozzi, Sara, and Scott, Clayton. Classification with asymmetric label noise: Consistency and maximal denoising. United States: N. p., 2016.
Web. doi:10.1214/16-EJS1193.
Blanchard, Gilles, Flaska, Marek, Handy, Gregory, Pozzi, Sara, & Scott, Clayton. Classification with asymmetric label noise: Consistency and maximal denoising. United States. https://doi.org/10.1214/16-EJS1193
Blanchard, Gilles, Flaska, Marek, Handy, Gregory, Pozzi, Sara, and Scott, Clayton. 2016.
"Classification with asymmetric label noise: Consistency and maximal denoising". United States. https://doi.org/10.1214/16-EJS1193. https://www.osti.gov/servlets/purl/1366694.
@article{osti_1366694,
title = {Classification with asymmetric label noise: Consistency and maximal denoising},
author = {Blanchard, Gilles and Flaska, Marek and Handy, Gregory and Pozzi, Sara and Scott, Clayton},
abstractNote = {In many real-world classification problems, the labels of training examples are randomly corrupted. Most previous theoretical work on classification with label noise assumes that the two classes are separable, that the label noise is independent of the true class label, or that the noise proportions for each class are known. In this work, we give conditions that are necessary and sufficient for the true class-conditional distributions to be identifiable. These conditions are weaker than those analyzed previously, and allow for the classes to be nonseparable and the noise levels to be asymmetric and unknown. The conditions essentially state that a majority of the observed labels are correct and that the true class-conditional distributions are “mutually irreducible,” a concept we introduce that limits the similarity of the two distributions. For any label noise problem, there is a unique pair of true class-conditional distributions satisfying the proposed conditions, and we argue that this pair corresponds in a certain sense to maximal denoising of the observed distributions. Our results are facilitated by a connection to “mixture proportion estimation,” which is the problem of estimating the maximal proportion of one distribution that is present in another. We establish a novel rate of convergence result for mixture proportion estimation, and apply this to obtain consistency of a discrimination rule based on surrogate loss minimization. Experimental results on benchmark data and a nuclear particle classification problem demonstrate the efficacy of our approach. MSC 2010 subject classifications: Primary 62H30; secondary 68T10. Keywords and phrases: Classification, label noise, mixture proportion estimation, surrogate loss, consistency.},
doi = {10.1214/16-EJS1193},
url = {https://www.osti.gov/biblio/1366694},
journal = {Electronic Journal of Statistics},
issn = {1935-7524},
number = 2,
volume = 10,
place = {United States},
year = {2016},
month = {sep}
}
Works referencing / citing this record:
Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
journal, February 2018
- Li, Alexander Hanbo; Bradic, Jelena
- Journal of the American Statistical Association, Vol. 113, Issue 522
Learning to classify from impure samples with high-dimensional data
journal, July 2018
- Komiske, Patrick T.; Metodiev, Eric M.; Nachman, Benjamin
- Physical Review D, Vol. 98, Issue 1
Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
text, January 2018
- Li, Alexander Hanbo; Bradic, Jelena
- Taylor & Francis
Boosting in the Presence of Outliers: Adaptive Classification with Non-convex Loss Functions
text, January 2017
- Li, Alexander Hanbo; Bradic, Jelena
- Taylor & Francis
Learning to Classify from Impure Samples with High-Dimensional Data
text, January 2018
- Komiske, Patrick T.; Metodiev, Eric M.; Nachman, Benjamin
- arXiv