Momentum-Accelerated Richardson(m) and Their Multilevel Neural Solvers
- Publication Year :
- 2024
Abstract
- Recently, designing neural solvers for large-scale linear systems of equations has emerged as a promising approach in scientific and engineering computing. This paper first introduces the Richardson(m) neural solver, which employs a meta network to predict the weights of the long-step Richardson iterative method. Next, by incorporating momentum and preconditioning techniques, we further enhance its convergence. Numerical experiments on anisotropic second-order elliptic equations demonstrate that these new solvers achieve faster convergence and lower computational complexity than both the Chebyshev iterative method with optimal weights and the Chebyshev semi-iterative method. To address the strong dependence of these single-level neural solvers on the PDE parameters and grid size, we integrate them with two recently developed multilevel neural solvers. Using alternating optimization techniques, we construct Richardson(m)-FNS for anisotropic equations and NAG-Richardson(m)-WANS for the Helmholtz equation. Numerical experiments show that these two multilevel neural solvers effectively overcome the parameter and grid-size dependence of the single-level methods, providing better robustness and computational efficiency.
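- To make the iteration the abstract refers to concrete, the following is a minimal NumPy sketch of a Richardson(m) iteration with a heavy-ball-style momentum term. The cyclic weight list `omegas`, the momentum coefficient `beta`, and the small test system are illustrative assumptions only; in the paper, the weights are predicted by a meta network (not reproduced here), and the Helmholtz variant uses Nesterov-style acceleration rather than this simple heavy-ball form.

```python
# Minimal sketch of a momentum-accelerated Richardson(m) iteration for A x = b.
# Assumptions: `omegas` is a hand-picked cyclic weight schedule and `beta` is a
# scalar heavy-ball momentum coefficient; the paper instead predicts the
# weights with a meta network.
import numpy as np

def richardson_m(A, b, omegas, beta=0.0, x0=None, tol=1e-8, max_iters=1000):
    """Long-step Richardson method with m weights applied cyclically.

    beta = 0 recovers the plain Richardson(m) iteration.
    """
    x = np.zeros_like(b) if x0 is None else x0.copy()
    x_prev = x.copy()
    m = len(omegas)
    for k in range(max_iters):
        r = b - A @ x                                  # current residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        # Richardson update with cyclic weight, plus heavy-ball momentum term
        x_next = x + omegas[k % m] * r + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Usage on a small SPD test system (illustrative, not from the paper)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = richardson_m(A, b, omegas=[0.25, 0.4], beta=0.3)
print(x, np.linalg.norm(b - A @ x))
```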
- Subjects :
- Mathematics - Numerical Analysis
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2412.08076
- Document Type :
- Working Paper