Summary: CGBoost: Conjugate Gradient in Function Space
Ling Li, Yaser S. Abu-Mostafa, Amrit Pratap
Learning Systems Group, California Institute of Technology,
Pasadena, CA 91125, USA
The superior out-of-sample performance of AdaBoost has been attributed
to the fact that it minimizes a cost function based on margin, in that it can
be viewed as a special case of AnyBoost, an abstract gradient descent
algorithm. In this paper, we provide a more sophisticated abstract boosting
algorithm, CGBoost, based on conjugate gradient in function space.
When the AdaBoost exponential cost function is optimized, CGBoost
generally yields much lower cost and training error but higher test error,
which implies that the exponential cost is vulnerable to overfitting. With
the optimization power of CGBoost, we can adopt more "regularized"
cost functions that have better out-of-sample performance but are difficult
to optimize. Our experiments demonstrate that CGBoost generally
outperforms AnyBoost in cost reduction. With suitable cost functions,
CGBoost can have better out-of-sample performance.
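The core idea above can be sketched numerically. The following toy (not the paper's implementation) does conjugate gradient descent directly in function space on the AdaBoost exponential cost, with F represented by its values on the training points; the Polak-Ribiere coefficient and the grid line search are illustrative choices, and the weak-learner approximation of each direction used by the actual CGBoost algorithm is omitted.

```python
import numpy as np

def exp_cost(F, y):
    # AdaBoost exponential cost: sum_i exp(-y_i * F(x_i))
    return np.exp(-y * F).sum()

def cg_in_function_space(y, steps=10):
    """Toy conjugate-gradient descent in function space.

    F holds the ensemble's scores on the training points only; real
    CGBoost fits a weak learner to approximate each direction d.
    """
    n = len(y)
    F = np.zeros(n)
    g_prev, d = None, None
    for t in range(steps):
        g = -y * np.exp(-y * F)  # gradient of the cost w.r.t. F
        if d is None:
            d = -g               # first step reduces to gradient descent
        else:
            # Polak-Ribiere coefficient couples the new direction
            # to the previous one (the "conjugate" part)
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))
            d = -g + beta * d
        # crude grid line search for the step size along d
        alphas = np.linspace(0.0, 2.0, 201)
        costs = [exp_cost(F + a * d, y) for a in alphas]
        F = F + alphas[int(np.argmin(costs))] * d
        g_prev = g
    return F
```

On a tiny label vector such as `y = np.array([1, -1, 1, 1, -1])`, the cost drops rapidly and all training points end up on the correct side, consistent with the abstract's observation that aggressive optimization of the exponential cost drives training error down quickly.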