
Summary: The Annals of Statistics
2007, Vol. 35, No. 2, 608–633
DOI: 10.1214/009053606000001217
© Institute of Mathematical Statistics, 2007
FAST LEARNING RATES FOR PLUG-IN CLASSIFIERS
BY JEAN-YVES AUDIBERT AND ALEXANDRE B. TSYBAKOV
École Nationale des Ponts et Chaussées and Université Paris VI
It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
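
For orientation, the objects the abstract refers to can be written out. These are the standard definitions of binary classification, stated here as a sketch (the paper's exact conventions may differ slightly):

\[
\eta(x) = P(Y = 1 \mid X = x), \qquad f^*(x) = \mathbf{1}\{\eta(x) \ge 1/2\},
\]
\[
R(f) = P\bigl(Y \ne f(X)\bigr), \qquad \mathcal{E}(f) = R(f) - R(f^*).
\]

Here $\eta$ is the regression function, $f^*$ the Bayes classifier, $R(f)$ the misclassification risk of a classifier $f$, and $\mathcal{E}(f)$ the excess Bayes risk whose rate of decay in the sample size $n$ is at issue. A plug-in classifier substitutes an estimator $\hat\eta_n$ for $\eta$ and predicts $\hat f_n(x) = \mathbf{1}\{\hat\eta_n(x) \ge 1/2\}$, whereas empirical risk minimization selects the classifier minimizing the empirical misclassification rate over a fixed class.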
1. Introduction. Let $(X,Y)$ be a random couple taking values in $\mathcal{Z} = \mathbb{R}^d \times \{0,1\}$ with joint distribution $P$. We regard $X \in \mathbb{R}^d$ as a vector of features corresponding to an object and $Y \in \{0,1\}$ as a label indicating that the object belongs to one of two classes.
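
As an illustration of the plug-in principle on this setup, here is a minimal Python sketch: estimate $\eta(x) = P(Y = 1 \mid X = x)$ nonparametrically and predict label 1 wherever the estimate reaches 1/2. A Nadaraya–Watson kernel smoother stands in for the local polynomial estimators the paper actually analyzes; the Gaussian kernel, the bandwidth value, and the toy data are illustrative assumptions, not the paper's construction.

import numpy as np

def plug_in_classifier(X_train, y_train, X_test, bandwidth=0.5):
    """Plug-in rule: estimate eta(x) = P(Y=1 | X=x) with a
    Nadaraya-Watson kernel smoother, then predict 1 iff the
    estimate is >= 1/2.  Kernel and bandwidth are illustrative."""
    # Pairwise squared Euclidean distances, shape (n_test, n_train).
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    # Gaussian kernel weights.
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Kernel regression estimate of eta at each test point.
    eta_hat = w @ y_train / np.maximum(w.sum(axis=1), 1e-12)
    # Threshold at 1/2: the plug-in step.
    return (eta_hat >= 0.5).astype(int)

# Toy usage: two Gaussian classes in R^2.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(-1.0, 1.0, (200, 2)),
                     rng.normal(+1.0, 1.0, (200, 2))])
y_train = np.r_[np.zeros(200), np.ones(200)]
X_test = rng.normal(0.0, 1.0, (50, 2))
print(plug_in_classifier(X_train, y_train, X_test))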

  

Source: Audibert, Jean-Yves - Département d'Informatique, École Normale Supérieure
Tsybakov, Alexandre - Laboratoire de Probabilités et Modèles Aléatoires, Université Pierre-et-Marie-Curie, Paris 6

 

Collections: Computer Technologies and Information Sciences; Mathematics