
Stochastic Trust-Region Algorithm in Random Subspaces with Convergence and Expected Complexity Analyses

Journal Article · SIAM Journal on Optimization
DOI: https://doi.org/10.1137/22m1524072 · OSTI ID: 2480304
Authors: [1]; [2]
  1. Argonne National Laboratory (ANL), Argonne, IL (United States)
  2. Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
This work proposes a framework for large-scale stochastic derivative-free optimization (DFO) by introducing STARS, a trust-region method based on iterative minimization in random subspaces. The framework is both an algorithmic and theoretical extension of a random-subspace derivative-free optimization framework (RSDFO) and of an algorithm for stochastic optimization with random models (STORM). Like RSDFO, STARS achieves scalability by minimizing interpolation models that approximate the objective in low-dimensional affine subspaces, thus significantly reducing per-iteration costs in terms of function evaluations and yielding strong performance on large-scale stochastic DFO problems. The user-determined dimension of these subspaces, when they are defined, for example, by the columns of so-called Johnson-Lindenstrauss transforms, turns out to be independent of the dimension of the problem. For convergence purposes, inspired by the analyses of RSDFO and STORM, both a particular quality of the subspace and the accuracies of random function estimates and models are required to hold with sufficiently high, but fixed, probabilities. Under these assumptions, martingale theory yields almost sure global convergence of STARS to a first-order stationary point, and the expected number of iterations required to reach a desired first-order accuracy is shown to be similar, up to constants, to that of STORM and other stochastic DFO algorithms.
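To make the subspace idea concrete, the following is a minimal schematic sketch (not the paper's implementation) of a random-subspace trust-region loop: each iteration draws a Johnson-Lindenstrauss-style Gaussian matrix whose columns define a low-dimensional affine subspace, builds a linear model there from only p+1 function evaluations, and takes a trust-region step. All names, constants, and the simple linear model are illustrative assumptions; STARS as analyzed in the paper uses interpolation models and probabilistic accuracy requirements not reproduced here.

```python
import numpy as np

def random_subspace_tr_sketch(f, x0, p=2, delta0=1.0, max_iter=200,
                              eta=0.1, gamma_inc=2.0, gamma_dec=0.5,
                              seed=0):
    """Schematic random-subspace trust-region loop (illustrative only).

    At each iteration: draw a JL-style random matrix Q (n x p) with
    i.i.d. Gaussian entries scaled so that E[Q Q^T] = I, build a
    forward-difference linear model of f restricted to x + range(Q),
    and take a Cauchy-type trust-region step in subspace coordinates.
    Per-iteration cost is p + 1 evaluations, independent of n.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    delta = delta0
    fx = f(x)
    for _ in range(max_iter):
        # JL-style subspace basis: entries N(0, 1/p).
        Q = rng.standard_normal((n, p)) / np.sqrt(p)
        # Forward-difference gradient of the subspace model.
        g = np.array([(f(x + delta * Q[:, j]) - fx) / delta
                      for j in range(p)])
        gnorm = np.linalg.norm(g)
        if gnorm == 0.0:
            break
        # Minimize the linear model on the radius-delta ball.
        s = -delta * g / gnorm
        x_trial = x + Q @ s
        f_trial = f(x_trial)
        pred = delta * gnorm               # predicted model decrease
        rho = (fx - f_trial) / pred        # actual-to-predicted ratio
        if rho >= eta:                     # successful: accept, expand
            x, fx = x_trial, f_trial
            delta *= gamma_inc
        else:                              # unsuccessful: shrink radius
            delta *= gamma_dec
    return x, fx
```

The key feature the sketch illustrates is the per-iteration evaluation budget: with p fixed (here p = 2), the number of function values per iteration does not grow with the ambient dimension n.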
Research Organization:
Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
Sponsoring Organization:
USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR). Scientific Discovery through Advanced Computing (SciDAC)
Grant/Contract Number:
AC02-05CH11231; AC02-06CH11357
OSTI ID:
2480304
Journal Information:
SIAM Journal on Optimization, Vol. 34, Issue 3; ISSN 1052-6234
Publisher:
Society for Industrial and Applied Mathematics (SIAM)
Country of Publication:
United States
Language:
English

References (29)

Derivative-Free and Blackbox Optimization book January 2017
VXQR: derivative-free unconstrained optimization based on QR factorizations journal September 2010
Stochastic optimization using a trust-region method and random models journal April 2017
Constrained stochastic blackbox optimization using a progressive barrier and probabilistic estimates journal March 2022
Scalable subspace methods for derivative-free nonlinear least-squares optimization journal June 2022
Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates journal March 2021
Expected complexity analysis of stochastic direct-search journal November 2021
Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization journal March 2023
Stochastic Nelder–Mead simplex method – A new globally convergent direct search method for simulation optimization journal August 2012
Derivative-free optimization methods journal May 2019
Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions journal February 2021
Extensions of Lipschitz mappings into a Hilbert space book January 1984
Topics in Random Matrix Theory book January 2012
The Efficient Generation of Random Orthogonal Matrices with an Application to Condition Estimators journal June 1980
Benchmarking Derivative-Free Optimization Algorithms journal January 2009
Introduction to Derivative-Free Optimization book January 2009
ASTRO-DF: A Class of Adaptive Sampling Trust-Region Algorithms for Derivative-Free Stochastic Optimization journal January 2018
A Stochastic Line Search Method with Expected Complexity Analysis journal January 2020
Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise journal January 2021
A Stochastic Levenberg--Marquardt Method Using Random Models with Complexity Results journal March 2022
Direct Search Based on Probabilistic Descent in Reduced Spaces journal November 2023
Big Omicron and big Omega and big Theta journal April 1976
Estimating Derivatives of Noisy Simulations journal April 2012
Sparser Johnson-Lindenstrauss Transforms journal January 2014
Sharp nonasymptotic bounds on the norm of random matrices with independent entries journal July 2016
Adaptive estimation of a quadratic functional by model selection journal October 2000
Convergence Rate Analysis of a Stochastic Trust-Region Method via Supermartingales journal April 2019
Computational Advertising: Techniques for Targeting Relevant Ads journal January 2014
Stochastic Trust-Region Methods with Trust-Region Radius Depending on Probabilistic Models journal January 2022

Similar Records

A Class of Sparse Johnson–Lindenstrauss Transforms and Analysis of their Extreme Singular Values
Journal Article · February 2025 · SIAM Journal on Matrix Analysis and Applications · OSTI ID:2563384

A new Krylov-subspace method for symmetric indefinite linear systems
Technical Report · October 1994 · OSTI ID:10190810

Derivative-free stochastic optimization via adaptive sampling strategies
Journal Article · September 2025 · Optimization Methods and Software · OSTI ID:2998214