Alternating Anderson Acceleration-like Algorithms for Solving Large Sparse Linear Systems


Abstract

The increasing scale of scientific and engineering computations presents significant challenges in efficiently solving large sparse linear systems. In this work, we develop a class of enhanced iterative solvers by integrating Anderson Acceleration with the Hermitian and skew-Hermitian splitting (HSS) framework. Specifically, we propose the alternating Anderson-accelerated HSS (AAHSS) method, which incorporates weighted HSS iterations and periodically applies Anderson Acceleration to improve convergence. A spectral radius-based convergence analysis is conducted to theoretically support the proposed strategy. Building upon the AAHSS method, we further design four algorithmic variants: the single-step Anderson-accelerated HSS (AASHSS) method, its preconditioned version (AASPHSS), a parameterized single-step variant (AAPSHSS), and the parameterized preconditioned single-step (AAPSPHSS) method. Comprehensive numerical experiments on a range of benchmark problems demonstrate that all proposed methods significantly reduce iteration counts and improve computational efficiency compared to the alternating Anderson-accelerated Richardson (AAR) method and to classical Krylov subspace methods such as GMRES, BiCGStab, Quasi-Minimal Residual (QMR), and CG. Among them, the AAPSHSS method performs particularly well on nonsymmetric systems, while the AAPSPHSS method is especially effective for symmetric problems.
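The alternating pattern described in the abstract — plain weighted fixed-point iterations punctuated by a periodic Anderson mixing step — can be sketched for a generic splitting iteration as follows. This is an illustrative AAR-style sketch, not the authors' AAHSS implementation: the weighted Richardson update stands in for the HSS half-steps, and the parameter values (`omega`, window size `m`, acceleration period `p`) are assumed for demonstration.

```python
import numpy as np

def alternating_anderson(A, b, x0=None, omega=0.25, m=3, p=4,
                         tol=1e-10, max_iter=1000):
    """Illustrative alternating Anderson acceleration for A x = b.

    Performs weighted Richardson steps x <- x + omega*(b - A x) and,
    every p-th iteration, replaces the update with an Anderson mixing
    step over a window of the last (up to) m residual differences.
    All parameters are illustrative defaults, not values from the paper.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    X, R = [], []  # recent iterates and residuals (sliding window)
    for k in range(1, max_iter + 1):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        X.append(x.copy())
        R.append(r.copy())
        if len(X) > m + 1:          # keep at most m differences
            X.pop(0)
            R.pop(0)
        if k % p == 0 and len(R) > 1:
            # Anderson step: least-squares coefficients over residual
            # differences, then the mixed update.
            dR = np.column_stack([R[i + 1] - R[i] for i in range(len(R) - 1)])
            dX = np.column_stack([X[i + 1] - X[i] for i in range(len(X) - 1)])
            gamma, *_ = np.linalg.lstsq(dR, r, rcond=None)
            x = x + omega * r - (dX + omega * dR) @ gamma
        else:
            x = x + omega * r       # plain weighted Richardson step
    return x, max_iter
```

In an HSS-based variant, the plain step would instead apply the Hermitian and skew-Hermitian half-sweeps; the acceleration logic around it is unchanged, which is why the periodic Anderson step composes naturally with different inner splittings.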
