 
SIAM J. NUMER. ANAL. © 2009 Society for Industrial and Applied Mathematics
Vol. 47, No. 2, pp. 997-1018
ACCELERATED LINE-SEARCH AND TRUST-REGION METHODS
P.-A. ABSIL AND K. A. GALLIVAN
Abstract. In numerical optimization, line-search and trust-region methods are two important
classes of descent schemes with well-understood global convergence properties. We say that these
methods are "accelerated" when the conventional iterate is replaced by any point that produces at
least as much of a decrease in the cost function as a fixed fraction of the decrease produced by the
conventional iterate. A detailed convergence analysis reveals that the global convergence properties of
line-search and trust-region methods still hold when the methods are accelerated. The analysis is
performed in the general context of optimization on manifolds, of which optimization in R^n is a
particular case. This general convergence analysis sheds new light on the behavior of several existing
algorithms.
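The acceptance rule described in the abstract admits a simple computational form. The following is a minimal sketch, not the authors' implementation: given the current iterate, the conventional next iterate, and any candidate (e.g., produced by a subspace acceleration step), the candidate is accepted when it achieves at least a fixed fraction c of the conventional decrease. The function and parameter names are illustrative.

```python
def accelerated_step(f, x_k, x_conv, candidate, c=0.5):
    """Accept `candidate` if it decreases f at least as much as a
    fixed fraction c of the decrease produced by the conventional
    iterate `x_conv`; otherwise fall back to `x_conv`.
    Assumes 0 < c <= 1 and that x_conv is a descent step (f(x_conv) <= f(x_k)).
    """
    decrease_conv = f(x_k) - f(x_conv)   # decrease of the conventional step
    decrease_cand = f(x_k) - f(candidate)
    if decrease_cand >= c * decrease_conv:
        return candidate   # accelerated iterate accepted
    return x_conv          # conventional iterate retained
```

Any candidate whatsoever may be proposed; the test above is what preserves the global convergence guarantees of the underlying line-search or trust-region scheme.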
Key words. line search, trust region, subspace acceleration, sequential subspace method, Riemannian
manifold, optimization on manifolds, Riemannian optimization, Arnoldi, Jacobi-Davidson,
locally optimal block preconditioned conjugate gradient (LOBPCG)
AMS subject classifications. 65B99, 65K05, 65J05, 65F15, 90C30
DOI. 10.1137/08072019X
1. Introduction. Let f be a real-valued function defined on a domain M, and
let {x_k} be a sequence of iterates generated as follows: for every k, some x_{k+1/2} ∈ M
