CGBoost: Conjugate Gradient in Function Space
Ling Li, Yaser S. Abu-Mostafa, Amrit Pratap

Summary: CGBoost: Conjugate Gradient in Function Space
Ling Li, Yaser S. Abu-Mostafa, Amrit Pratap
Learning Systems Group, California Institute of Technology, Pasadena, CA 91125, USA
{ling,yaser,amrit}@caltech.edu
Abstract
The superior out-of-sample performance of AdaBoost has been attributed to the fact that it minimizes a cost function based on margin, in that it can be viewed as a special case of AnyBoost, an abstract gradient descent algorithm. In this paper, we provide a more sophisticated abstract boosting algorithm, CGBoost, based on conjugate gradient in function space. When the AdaBoost exponential cost function is optimized, CGBoost generally yields much lower cost and training error but higher test error, which implies that the exponential cost is vulnerable to overfitting. With the optimization power of CGBoost, we can adopt more "regularized" cost functions that have better out-of-sample performance but are difficult to optimize. Our experiments demonstrate that CGBoost generally outperforms AnyBoost in cost reduction. With suitable cost functions, CGBoost can have better out-of-sample performance.
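
The following is a minimal Python sketch of the idea the abstract describes: boosting viewed as descent in function space, where CGBoost replaces the steepest-descent direction of AnyBoost with a conjugate direction d_t = h_t + beta_t * d_{t-1}. The decision-stump base learner, the Polak-Ribiere-style beta, and the grid line search below are illustrative assumptions, not the paper's exact choices.

import numpy as np

def exp_cost(F, y):
    # AdaBoost's exponential cost: C(F) = mean of exp(-y F(x))
    return np.mean(np.exp(-y * F))

def neg_grad(F, y):
    # Negative functional gradient of C at the training points
    return y * np.exp(-y * F) / len(y)

def fit_stump(X, r):
    # Hypothetical base learner: decision stump h in {-1,+1}^n
    # maximizing the inner product <r, h> with the pseudo-residuals r
    best_score, best_h = -np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1.0, -1.0):
                h = pol * np.where(X[:, j] <= thr, 1.0, -1.0)
                s = r @ h
                if s > best_score:
                    best_score, best_h = s, h
    return best_h

def line_search(F, d, y, steps=np.linspace(0.0, 2.0, 201)):
    # Crude grid search for the step size minimizing cost along direction d
    costs = [exp_cost(F + a * d, y) for a in steps]
    return steps[int(np.argmin(costs))]

def cgboost(X, y, T=20):
    # Conjugate-gradient boosting sketch: d_t = h_t + beta_t * d_{t-1}
    n = len(y)
    F = np.zeros(n)           # current ensemble output on the training set
    d_prev = np.zeros(n)
    g_prev = None
    for t in range(T):
        g = neg_grad(F, y)    # steepest-descent direction in function space
        h = fit_stump(X, g)   # base hypothesis approximating that direction
        if g_prev is None:
            beta = 0.0        # first step: plain gradient descent (AnyBoost)
        else:
            # Polak-Ribiere-style conjugation coefficient (an assumption here)
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))
        d = h + beta * d_prev # conjugate direction mixes in the previous one
        a = line_search(F, d, y)
        F = F + a * d
        d_prev, g_prev = d, g
    return F

# Toy usage on separable 2-D data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])
y[y == 0] = 1.0
F = cgboost(X, y)
print("final cost:", exp_cost(F, y), "train error:", np.mean(np.sign(F) != y))

Setting beta to zero in every iteration recovers the plain AnyBoost gradient descent that the paper's experiments compare against; swapping exp_cost and neg_grad for a more "regularized" cost changes only those two functions.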


Source: Abu-Mostafa, Yaser S. - Department of Electrical Engineering & Computer Science, California Institute of Technology


Collections: Computer Technologies and Information Sciences