 
The Annals of Statistics
2007, Vol. 35, No. 2, 608–633
DOI: 10.1214/009053606000001217
© Institute of Mathematical Statistics, 2007
FAST LEARNING RATES FOR PLUG-IN CLASSIFIERS
BY JEAN-YVES AUDIBERT AND ALEXANDRE B. TSYBAKOV
École Nationale des Ponts et Chaussées and Université Paris VI
It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^{-1}, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also superfast rates, that is, rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
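To make the object of study concrete: a plug-in classifier first estimates the regression function η(x) = P(Y = 1 | X = x) and then predicts the label 1{η̂(x) ≥ 1/2}, mimicking the Bayes rule. The following is a minimal illustrative sketch, not the estimators analyzed in this paper; the Nadaraya–Watson kernel estimate and the bandwidth value are assumptions chosen only for the example.

```python
import numpy as np

def plug_in_classifier(X_train, y_train, X_test, bandwidth=0.5):
    """Plug-in classification: estimate eta(x) = P(Y=1 | X=x)
    nonparametrically, then threshold the estimate at 1/2
    (mimicking the Bayes rule 1{eta(x) >= 1/2}).

    Illustrative sketch only: uses a Gaussian-kernel
    Nadaraya-Watson estimate of eta, which is an assumption
    made for this example, not the construction of the paper."""
    # Squared distances between every test and training point.
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel weights.
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Weighted average of the labels estimates eta at each test point.
    eta_hat = (w * y_train).sum(axis=1) / np.maximum(w.sum(axis=1), 1e-12)
    # Plug the estimate into the Bayes rule.
    return (eta_hat >= 0.5).astype(int)

# Usage: two well-separated clusters on the real line.
X_train = np.array([[0.0], [0.1], [2.0], [2.1]])
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.05], [2.05]])
preds = plug_in_classifier(X_train, y_train, X_test)  # -> [0, 1]
```

The excess risk of such a rule is controlled by how well η̂ approximates η near the decision boundary {x : η(x) = 1/2}, which is why the margin assumption governs the attainable rates.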
1. Introduction. Let (X, Y) be a random couple taking values in Z = R^d × {0,1} with joint distribution P. We regard X ∈ R^d as a vector of features corresponding to an object and Y ∈ {0,1} as a label indicating that the object belongs to
