U.S. Department of Energy
Office of Scientific and Technical Information

Convergence analysis of Anderson-type acceleration of Richardson's iteration

Journal Article · Numerical Linear Algebra with Applications
DOI: https://doi.org/10.1002/nla.2241 · OSTI ID: 1511931
Affiliations:
  1. Emory Univ., Atlanta, GA (United States). Dept. of Mathematics and Computer Science; Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). National Center for Computational Sciences

We consider Anderson extrapolation to accelerate the (stationary) Richardson iterative method for sparse linear systems. Applying Anderson mixing at periodic intervals, we assess how this acceleration affects convergence to a prescribed accuracy. The resulting method, alternating Anderson–Richardson, has appealing properties for high-performance computing, such as the potential to reduce communication and storage in comparison with more conventional linear solvers. We establish sufficient conditions for convergence, evaluate the performance of this technique in combination with various preconditioners through numerical examples, and propose an augmented version of the technique.
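The scheme described in the abstract can be sketched as follows: run the stationary Richardson update x ← x + ω(b − Ax), and every p-th iteration replace it with an Anderson mixing step built from the last few residual differences. This is a minimal NumPy illustration of the general idea, not the paper's exact formulation; the parameter names (`omega`, `m`, `p`) and the least-squares variant of the mixing step are assumptions for the sketch.

```python
import numpy as np

def aar(A, b, x0=None, omega=0.2, m=4, p=3, tol=1e-8, maxit=500):
    """Sketch of alternating Anderson-Richardson: Richardson steps
    x <- x + omega*(b - A@x), with an Anderson mixing step every p
    iterations using up to m recent residual differences."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.copy()
    X, R = [], []  # histories of iterates and residuals
    for k in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        X.append(x.copy())
        R.append(r.copy())
        X, R = X[-(m + 1):], R[-(m + 1):]  # keep at most m differences
        if (k + 1) % p == 0 and len(R) > 1:
            # Anderson mixing: least-squares fit of recent residual
            # differences to the current residual, then extrapolate.
            dR = np.column_stack([R[i + 1] - R[i] for i in range(len(R) - 1)])
            dX = np.column_stack([X[i + 1] - X[i] for i in range(len(X) - 1)])
            gamma, *_ = np.linalg.lstsq(dR, r, rcond=None)
            x = x + omega * r - (dX + omega * dR) @ gamma
        else:
            x = x + omega * r  # plain (stationary) Richardson step
    return x, maxit
```

Because the mixing window keeps only `m` residual differences, storage and communication stay bounded regardless of iteration count, which is the high-performance-computing appeal the abstract mentions.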

Research Organization:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE Office of Science (SC)
Grant/Contract Number:
AC05-00OR22725
OSTI ID:
1511931
Journal Information:
Numerical Linear Algebra with Applications, Vol. 26, Issue 4; ISSN 1070-5325
Publisher:
Wiley
Country of Publication:
United States
Language:
English
