U.S. Department of Energy
Office of Scientific and Technical Information

Inexact Newton-CG algorithms with complexity guarantees

Journal Article · IMA Journal of Numerical Analysis

Abstract

We consider variants of a recently developed Newton-CG algorithm for nonconvex problems (Royer, C. W. & Wright, S. J. (2018) Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization. SIAM J. Optim., 28, 1448–1477) in which inexact estimates of the gradient and the Hessian are used in various steps. Under certain conditions on the inexactness measures, we derive iteration complexity bounds for achieving $\epsilon$-approximate second-order optimality that match the best known lower bounds. Our inexactness condition on the gradient is adaptive, allowing for crude accuracy in regions with large gradients. We describe two variants of our approach, one in which the step size along the computed search direction is chosen adaptively, and another in which the step size is pre-defined. To obtain second-order optimality, our algorithms make use of a negative curvature direction on some steps. These directions can be obtained, with high probability, using the randomized Lanczos algorithm; in this sense, all of our results hold with high probability over the run of the algorithm. We evaluate the performance of our proposed algorithms empirically on several machine learning models. Our approach is a first attempt to introduce inexact Hessian and/or gradient information into the Newton-CG algorithm of Royer & Wright (2018).
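To make the negative curvature step concrete: the abstract notes that such directions can be found, with high probability, by the randomized Lanczos algorithm applied through Hessian-vector products. The sketch below is not the paper's implementation; it is a minimal illustration, with the function name lanczos_min_eig, the iteration budget and the test matrix all chosen here for demonstration only.

```python
# Minimal sketch: estimate the smallest eigenvalue and a corresponding
# (possibly negative curvature) direction of a symmetric operator using
# randomized Lanczos with Hessian-vector products only.
import numpy as np

def lanczos_min_eig(hess_vec, dim, num_iters=20, seed=0):
    """Return an estimate (lam_min, v) of the smallest eigenpair of the
    symmetric operator defined by hess_vec, starting from a random vector."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(dim)
    q /= np.linalg.norm(q)
    Q = np.zeros((dim, num_iters))          # Lanczos basis vectors
    alphas, betas = [], []
    beta, q_prev = 0.0, np.zeros(dim)
    for j in range(num_iters):
        Q[:, j] = q
        w = hess_vec(q) - beta * q_prev
        alpha = q @ w
        w -= alpha * q
        # Full reorthogonalization for numerical stability on small problems.
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if beta < 1e-12:                    # invariant subspace found; stop
            break
        betas.append(beta)
        q_prev, q = q, w / beta
    k = len(alphas)
    # Tridiagonal matrix built from the Lanczos recurrence coefficients.
    T = np.diag(alphas) + np.diag(betas[: k - 1], 1) + np.diag(betas[: k - 1], -1)
    evals, evecs = np.linalg.eigh(T)
    lam_min = evals[0]
    v = Q[:, :k] @ evecs[:, 0]              # Ritz vector in the original space
    return lam_min, v / np.linalg.norm(v)

# Example: detect negative curvature of an indefinite diagonal Hessian.
H = np.diag(np.concatenate([np.linspace(1.0, 5.0, 9), [-0.5]]))
lam, d = lanczos_min_eig(lambda v: H @ v, dim=10)
print(lam, d @ H @ d)   # both approximately -0.5
```

If the returned lam is sufficiently negative, the unit vector d serves as a negative curvature direction along which the objective can be decreased; otherwise the iteration proceeds with a (possibly inexact) Newton-CG step.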

Research Organization:
US Department of Energy (USDOE), Washington, DC (United States). Office of Science, Exascale Computing Project
Sponsoring Organization:
USDOE
OSTI ID:
2424937
Journal Information:
IMA Journal of Numerical Analysis, Vol. 43, Issue 3; ISSN 0272-4979
Publisher:
Oxford University Press/Institute of Mathematics and its Applications
Country of Publication:
United States
Language:
English

References (23)

Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results · journal · May 2009
Sub-sampled Newton methods · journal · November 2018
A Stochastic Line Search Method with Expected Complexity Analysis · journal · January 2020
Efficient BackProp · book · January 2012
Cubic regularization of Newton method and its global performance · journal · April 2006
A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization · journal · January 2019
Newton-type methods for non-convex optimization under inexact Hessian information · journal · May 2019
Some NP-complete problems in quadratic and nonlinear programming · journal · June 1987
Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization · journal · January 2019
Most Tensor Problems Are NP-Hard · journal · November 2013
Complexity and global rates of trust-region methods based on probabilistic models · journal · August 2017
Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity · journal · January 2010
Convergence of Newton-MR under Inexact Hessian Information · journal · January 2021
The Conjugate Gradient Method and Trust Regions in Large Scale Optimization · journal · June 1983
LIBSVM: A library for support vector machines · journal · April 2011
Global convergence rate analysis of unconstrained optimization methods based on probabilistic models · journal · April 2017
Convergence Rate Analysis of a Stochastic Trust-Region Method via Supermartingales · journal · April 2019
Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization · journal · January 2021
Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization · journal · January 2018
Second-order Optimization for Non-convex Machine Learning: an Empirical Study · book · January 2020
Complexity bounds for second-order optimality in unconstrained optimization · journal · February 2012
First-order and Stochastic Optimization Methods for Machine Learning · book · January 2020
Trust Region Methods · book · January 2000

Similar Records

Exact and inexact subsampled Newton methods for optimization
Journal Article · April 3, 2018 · IMA Journal of Numerical Analysis · OSTI ID:1610078

An inexact regularized Newton framework with a worst-case iteration complexity of $ {\mathscr O}(\varepsilon^{-3/2}) $ for nonconvex optimization
Journal Article · May 8, 2018 · IMA Journal of Numerical Analysis · OSTI ID:1611471

An inexact regularized Newton framework with a worst-case iteration complexity of $ {\mathscr O}(\varepsilon^{-3/2}) $ for nonconvex optimization
Journal Article · May 8, 2018 · IMA Journal of Numerical Analysis · OSTI ID:1436375
