No fast exponential deviation inequalities for the progressive mixture rule
 

Jean-Yves Audibert
CERTIS - Ecole des Ponts
19, rue Alfred Nobel - Cité Descartes
77455 Marne-la-Vallée - France
audibert@certis.enpc.fr
Abstract. We consider the learning task of predicting as well as the best
function in a finite reference set G, up to the smallest possible additive
term. If R(g) denotes the generalization error of a prediction function g,
then under reasonable assumptions on the loss function (typically satisfied
by the least squares loss when the output is bounded), it is known
that the progressive mixture rule \hat{g} satisfies
  E R(\hat{g}) \le \min_{g \in G} R(g) + C \frac{\log |G|}{n},   (1)
where n denotes the size of the training set, E denotes the expectation
w.r.t. the training set distribution and C denotes a positive constant.
This work mainly shows that for any training set size n, there exist ε > 0,
a reference set G and a probability distribution generating the data such

  

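The bound (1) concerns the progressive mixture rule, which averages the Gibbs (exponential-weights) mixtures computed on growing prefixes of the training sample. The following is a minimal sketch under the squared loss, not the paper's exact construction; the function name, the temperature parameter `lam`, and the reference set are illustrative assumptions:

```python
import numpy as np

def progressive_mixture(G, X, Y, lam=1.0):
    """Sketch of the progressive mixture rule over a finite set G.

    G    : list of prediction functions g: x -> float
    X, Y : training inputs/outputs of length n
    Returns a predictor that averages the exponential-weights
    (Gibbs) mixtures computed on each prefix of the sample.
    """
    n = len(X)
    losses = np.zeros(len(G))          # cumulative squared losses
    weight_history = []
    for i in range(n + 1):
        # Gibbs weights on the first i examples (min-shifted for stability)
        w = np.exp(-lam * (losses - losses.min()))
        weight_history.append(w / w.sum())
        if i < n:
            losses += np.array([(g(X[i]) - Y[i]) ** 2 for g in G])

    def g_hat(x):
        preds = np.array([g(x) for g in G])
        # average the mixture predictions over all n+1 prefixes
        return float(np.mean([w @ preds for w in weight_history]))

    return g_hat
```

With data on which one reference function clearly dominates, the prefix mixtures concentrate on it quickly, so the averaged predictor tracks the best g in G up to the additive O(log|G| / n) term of (1) in expectation; the paper's point is that this guarantee does not extend to a fast exponential deviation inequality.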
Source: Audibert, Jean-Yves - Département d'Informatique, École Normale Supérieure

 

Collections: Computer Technologies and Information Sciences