A generalization of s-step variants of gradient methods
Abstract
The s-step methods were proposed by Chronopoulos to gain efficiency in the parallel implementation of iterative methods for linear systems. They are variants of classical iterative methods based on the construction of a Krylov subspace basis at each iteration. These s-step methods were derived from algorithms such as the Conjugate Gradient, the Generalized Conjugate Residual, and the Minimal Residual methods. They converge for all symmetric matrices, all nonsymmetric definite matrices, and some nonsymmetric indefinite matrices. In this paper, we introduce an s-step variant of a General Orthogonalization Algorithm, that is, a generalization of s-step variants of gradient methods. We prove convergence and obtain error estimates. We also describe an Orthomin variant, together with a convergence theorem. From this we derive the well-known s-step methods as particular cases, as well as some that are, to our knowledge, new. This provides a unified framework for deriving and studying s-step methods. Some of the methods obtained converge for every nonsingular matrix. Finally, we give numerical results for the newly proposed methods, showing that their parallel implementations outperform the original ones.
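To make the s-step idea concrete, the following is a minimal sketch, not the paper's General Orthogonalization Algorithm: an illustrative s-step residual-minimizing iteration that, at each outer step, builds the Krylov block P = [r, Ar, ..., A^{s-1}r] from s matrix-vector products (the part that parallelizes well) and updates the iterate by the coefficients that minimize the residual norm over span(P). The function name, defaults, and the monomial choice of basis are assumptions made for illustration.

```python
import numpy as np

def s_step_residual_min(A, b, x0=None, s=4, tol=1e-10, max_outer=200):
    """Illustrative s-step iteration; names and defaults are assumptions."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    for _ in range(max_outer):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        # Monomial Krylov basis built from s matrix-vector products.
        P = np.empty((n, s))
        P[:, 0] = r
        for j in range(1, s):
            P[:, j] = A @ P[:, j - 1]
        AP = A @ P
        # Coefficients minimizing ||r - A P a|| over span(P).
        a, *_ = np.linalg.lstsq(AP, r, rcond=None)
        x += P @ a
    return x

# Example use on a small nonsymmetric system.
rng = np.random.default_rng(0)
A = np.eye(50) + 0.1 * rng.standard_normal((50, 50))
b = rng.standard_normal(50)
x = s_step_residual_min(A, b, s=4)
print(np.linalg.norm(b - A @ x))
```

Grouping s matrix-vector products per outer step reduces synchronization compared with performing one product per classical iteration, which is the efficiency motivation the abstract attributes to Chronopoulos.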
