U.S. Department of Energy
Office of Scientific and Technical Information

An investigation of Newton-Sketch and subsampled Newton methods

Journal Article · Optimization Methods and Software
Sketching, a dimensionality reduction technique, has received much attention in the statistics community. In this paper, we study sketching in the context of Newton's method for solving finite-sum optimization problems in which the number of variables and the number of data points are both large. We study two forms of sketching that perform dimensionality reduction in data space: Hessian subsampling and randomized Hadamard transformations. Each has its own advantages, and their relative tradeoffs have not been investigated in the optimization literature. Our study focuses on practical versions of the two methods in which the resulting linear systems of equations are solved only approximately, at every iteration, using an iterative solver. The advantages of the conjugate gradient method over a stochastic gradient iteration are revealed through a set of numerical experiments, and a complexity analysis of the Hessian subsampling method is presented.
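The core scheme described in the abstract, a Newton iteration whose Hessian is estimated on a random subsample and whose linear system is solved inexactly by conjugate gradient, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names (`subsampled_newton`, `hess_sample`, `conjugate_gradient`) and the regularized least-squares demo problem are our own assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=100):
    """Solve the symmetric positive definite system A x = b by CG."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    if np.sqrt(rs) < tol:          # right-hand side already (near) zero
        return x
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def subsampled_newton(grad_full, hess_sample, w0, n, sample_size, iters=30, seed=0):
    """Newton iteration with a subsampled Hessian and an inexact CG solve."""
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(iters):
        g = grad_full(w)                                   # exact gradient (all n terms)
        idx = rng.choice(n, size=sample_size, replace=False)
        H = hess_sample(w, idx)                            # Hessian from the subsample only
        p = conjugate_gradient(H, -g)                      # approximate Newton step
        w = w + p                                          # unit step (strongly convex case)
    return w

# Demo on a hypothetical regularized least-squares problem (not from the paper):
# f(w) = (1/2n) ||A w - b||^2 + (lam/2) ||w||^2
rng = np.random.default_rng(1)
n, d, lam = 200, 5, 1.0
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
grad_full = lambda w: A.T @ (A @ w - b) / n + lam * w
hess_sample = lambda w, idx: A[idx].T @ A[idx] / len(idx) + lam * np.eye(d)

w = subsampled_newton(grad_full, hess_sample, np.zeros(d), n, sample_size=100)
w_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
```

Because the gradient is exact and only the Hessian is subsampled, each step contracts the error by roughly the relative Hessian-approximation error, so the iterates converge to the minimizer even though no single step is an exact Newton step.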
Research Organization:
Argonne National Laboratory (ANL), Argonne, IL (United States)
Sponsoring Organization:
National Science Foundation (NSF); US Office of Naval Research (ONR); USDOD Defense Advanced Research Projects Agency (DARPA); USDOE
Grant/Contract Number:
AC02-06CH11357
OSTI ID:
1657509
Journal Information:
Optimization Methods and Software, Vol. 35, Issue 4; ISSN 1055-6788
Publisher:
Taylor & Francis
Country of Publication:
United States
Language:
English

References (14)

Second-Order Stochastic Optimization for Machine Learning in Linear Time text January 2016
Numerical Optimization book January 2006
Numerical Optimization book January 1999
Faster least squares approximation journal October 2010
Coordinate descent algorithms journal March 2015
Sub-sampled Newton methods journal November 2018
Exact and inexact subsampled Newton methods for optimization journal April 2018
New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property journal January 2011
Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence journal January 2017
Optimization Methods for Large-Scale Machine Learning journal January 2018
Sketching meets random projection in the dual: A provable recovery algorithm for big and high-dimensional data journal January 2017
A Stochastic Approximation Method journal September 1951
Computational Advertising: Techniques for Targeting Relevant Ads journal January 2014
Exact and Inexact Subsampled Newton Methods for Optimization preprint January 2016

Cited By (14)

Newton-MR: Inexact Newton Method with minimum residual sub-problem solver journal January 2022
Regularization by denoising sub-sampled Newton method for spectral CT multi-material decomposition journal May 2021
  • Perelli, Alessandro; Andersen, Martin S.
  • Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, Vol. 379, Issue 2200, https://doi.org/10.1098/rsta.2020.0191
Accelerating Adaptive Cubic Regularization of Newton's Method via Random Sampling preprint January 2018
Adaptive Cubic Regularization Methods with Dynamic Inexact Hessian Information and Applications to Finite-Sum Minimization preprint January 2018
Convergence Analysis of Inexact Randomized Iterative Methods preprint January 2019
OverSketched Newton: Fast Convex Optimization for Serverless Systems preprint January 2019
A Stochastic Extra-Step Quasi-Newton Method for Nonsmooth Nonconvex Optimization preprint January 2019
M-IHS: An Accelerated Randomized Preconditioning Method Avoiding Costly Matrix Decompositions preprint January 2019
A Multilevel Approach to Training preprint January 2020
Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems text January 2020
Generalization of Quasi-Newton Methods: Application to Robust Symmetric Multisecant Updates preprint January 2020
An Adaptive Stochastic Sequential Quadratic Programming with Differentiable Exact Augmented Lagrangians preprint January 2021
Scalable Subspace Methods for Derivative-Free Nonlinear Least-Squares Optimization preprint January 2021
A Multilevel Method for Self-Concordant Minimization preprint January 2021

Similar Records

Exact and inexact subsampled Newton methods for optimization
Journal Article · April 2018 · IMA Journal of Numerical Analysis · OSTI ID: 1610078

An inexact semismooth Newton method with application to adaptive randomized sketching for dynamic optimization
Journal Article · October 2023 · Finite Elements in Analysis and Design · OSTI ID: 2311364

A Scalable Interior‐Point Gauss–Newton Method for PDE‐Constrained Optimization With Bound Constraints
Journal Article · November 2025 · Numerical Linear Algebra with Applications · OSTI ID: 3014007