Global convergence of the restarted Lanczos and Jacobi–Davidson methods for symmetric eigenvalue problems
Details
  • Author: Kensuke Aishima
  • Keywords: 65F15; 15A18
  • Journal: Numerische Mathematik
  • Published: November 2015
  • Year: 2015
  • Volume: 131
  • Issue: 3
  • Pages: 405–423
  • Full text size: 1,451 KB
  • References:
    1. Baglama, J., Calvetti, D., Reichel, L.: Iterative methods for the computation of a few eigenvalues of a large symmetric matrix. BIT 36, 400–421 (1996)
    2. Baglama, J., Calvetti, D., Reichel, L.: IRBL: an implicitly restarted block Lanczos method for large-scale Hermitian eigenproblems. SIAM J. Sci. Comput. 24, 1650–1677 (2003)
    3. Bai, Z., Demmel, J., Dongarra, J., Ruhe, A., van der Vorst, H.: Templates for the Solution of Algebraic Eigenvalue Problems: A Practical Guide. SIAM, Philadelphia (2000)
    4. Beattie, C., Embree, M., Rossi, J.: Convergence of restarted Krylov subspaces to invariant subspaces. SIAM J. Matrix Anal. Appl. 25, 1074–1109 (2004)
    5. Beattie, C., Embree, M., Sorensen, D.C.: Convergence of polynomial restart Krylov methods for eigenvalue computations. SIAM Rev. 47, 492–515 (2005)
    6. Calvetti, D., Reichel, L., Sorensen, D.C.: An implicitly restarted Lanczos method for large symmetric eigenvalue problems. Electron. Trans. Numer. Anal. 2, 1–21 (1994)
    7. Crouzeix, M., Philippe, B., Sadkane, M.: The Davidson method. SIAM J. Sci. Comput. 15, 62–76 (1994)
    8. Cullum, J.K.: The simultaneous computation of a few of the algebraically largest and smallest eigenvalues of a large, symmetric, sparse matrix. BIT 18, 265–275 (1978)
    9. Cullum, J.K., Donath, W.E.: A block Lanczos algorithm for computing the \(q\) algebraically largest eigenvalues and a corresponding eigenspace for large, sparse symmetric matrices. In: Proceedings of the 1974 IEEE Conference on Decision and Control, pp. 505–509. IEEE Press, Piscataway, NJ (1974)
    10. Davidson, E.R.: The iterative calculation of a few of the lowest eigenvalues and corresponding eigenvectors of large real-symmetric matrices. J. Comput. Phys. 17, 87–94 (1975)
    11. Demmel, J.: Applied Numerical Linear Algebra. SIAM, Philadelphia (1997)
    12. Golub, G.H., Underwood, R.: The block Lanczos method for computing eigenvalues. In: Rice, J. (ed.) Mathematical Software III, pp. 364–377. Academic Press, New York (1977)
    13. Golub, G.H., Van Loan, C.F.: Matrix Computations, 4th edn. Johns Hopkins University Press, Baltimore (2013)
    14. Hochstenbach, M.E., Sleijpen, G.L.G.: Two-sided and alternating Jacobi–Davidson. Linear Algebra Appl. 358, 145–172 (2003)
    15. Horn, R.A., Johnson, C.R.: Matrix Analysis, 2nd edn. Cambridge University Press, Cambridge (2012)
    16. Karush, W.: An iterative method for finding characteristic vectors of a symmetric matrix. Pac. J. Math. 1, 233–248 (1951)
    17. Knyazev, A.V.: Toward the optimal preconditioned eigensolver: locally optimal block preconditioned conjugate gradient method. SIAM J. Sci. Comput. 23, 517–541 (2001)
    18. Knyazev, A.V., Skorokhodov, A.L.: On exact estimates of the convergence rate of the steepest ascent method in the symmetric eigenvalue problem. Linear Algebra Appl. 154–156, 245–257 (1991)
    19. Lanczos, C.: An iteration method for the solution of the eigenvalue problem of linear differential and integral operators. J. Res. Nat. Bur. Stand. 45, 255–282 (1950)
    20. Li, R.-C.: Sharpness in rates of convergence for the symmetric Lanczos method. Math. Comput. 79, 419–435 (2010)
    21. Morgan, R.B., Scott, D.S.: Generalizations of Davidson's method for computing eigenvalues of sparse symmetric matrices. SIAM J. Sci. Stat. Comput. 7, 817–825 (1986)
    22. Notay, Y.: Is Jacobi–Davidson faster than Davidson? SIAM J. Matrix Anal. Appl. 26, 522–543 (2005)
    23. Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems I: the preconditioning aspect. SIAM J. Numer. Anal. 41, 258–271 (2003)
    24. Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems II: the subspace acceleration. SIAM J. Numer. Anal. 41, 272–286 (2003)
    25. Parlett, B.N.: The Symmetric Eigenvalue Problem. Prentice-Hall, Englewood Cliffs (1980)
    26. Ruhe, A.: Implementation aspects of band Lanczos algorithms for computation of eigenvalues of large sparse symmetric matrices. Math. Comput. 33, 680–687 (1979)
    27. Sleijpen, G.L.G., van der Vorst, H.A.: A Jacobi–Davidson iteration method for linear eigenvalue problems. SIAM J. Matrix Anal. Appl. 17, 401–425 (1996)
    28. Sorensen, D.C.: Implicit application of polynomial filters in a \(k\)-step Arnoldi method. SIAM J. Matrix Anal. Appl. 13, 357–385 (1992)
    29. Stathopoulos, A.: Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part I: seeking one eigenvalue. SIAM J. Sci. Comput. 29, 481–514 (2007)
  • Author affiliation: Kensuke Aishima (1)

    1. The University of Tokyo, Hongo 7-3-1, Bunkyo-ku, Tokyo, Japan
  • Journal category: Mathematics and Statistics
  • Journal subjects: Mathematics
    Numerical Analysis
    Mathematical and Computational Physics
    Mathematical Methods in Physics
    Numerical and Computational Methods
    Applied Mathematics and Computational Methods of Engineering
  • Publisher: Springer Berlin / Heidelberg
  • ISSN: 0945-3245
Abstract
The Lanczos method is well known for computing the extremal eigenvalues of symmetric matrices. For efficiency and robustness a restart strategy is employed in practice, but this makes an analysis of convergence less straightforward. We prove global convergence of the restarted Lanczos method in exact arithmetic using certain convergence properties of the Rayleigh–Ritz procedure, which can be obtained from the discussion by Crouzeix, Philippe, and Sadkane. For the restarted Lanczos method, Sorensen’s previous analysis establishes global convergence to the largest eigenvalues under the technical assumption that the absolute values of the off-diagonal elements of the Lanczos tridiagonal matrix are larger than a positive constant throughout the iterations. In this paper, we prove global convergence without any such assumption. The only assumption is that the initial vector is not orthogonal to any of the target exact eigenvectors. More importantly, our results are applicable to dynamic restarting procedures where the dimensions of the projection subspace are dynamically determined. In other words, our analysis can be applied to the recently proposed efficient restart strategies employed in the thick restarted Lanczos method. The convergence theorem is extended to the restarted Lanczos method for computing both the largest and smallest eigenvalues. Moreover, we derive certain global convergence theorems of the block Lanczos and Jacobi–Davidson methods, where, for both algorithms, the Ritz values are shown to converge to exact eigenvalues, although they are not necessarily extremal.

Mathematics Subject Classification: 65F15, 15A18
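The abstract refers to the Rayleigh–Ritz procedure and to thick-restart strategies for the Lanczos method. As a rough illustration only (this is not the paper's algorithm or proof setup), the Python sketch below runs a simplified restart cycle: expand an orthonormal basis, project and diagonalize (Rayleigh–Ritz), then restart from the Ritz vectors of the k largest Ritz values. The function name thick_restart_lanczos and the parameters k, m, restarts are assumptions of this sketch, and full reorthogonalization is used in place of the three-term Lanczos recurrence for brevity.

```python
# Minimal sketch of a thick-restarted Lanczos cycle (illustrative, not the
# paper's algorithm). Assumed names: thick_restart_lanczos, k, m, restarts.
import numpy as np

def thick_restart_lanczos(A, v0, k=2, m=10, restarts=50):
    """Approximate the k largest eigenvalues of a symmetric matrix A.

    Each cycle expands an orthonormal basis to dimension m, applies the
    Rayleigh-Ritz procedure, and restarts from the k best Ritz vectors.
    """
    V = v0[:, None] / np.linalg.norm(v0)           # current orthonormal basis
    for _ in range(restarts):
        # Expansion: multiply the newest basis vector by A and orthogonalize
        # against the whole basis (full reorthogonalization for clarity).
        while V.shape[1] < m:
            w = A @ V[:, -1]
            w -= V @ (V.T @ w)
            nrm = np.linalg.norm(w)
            if nrm < 1e-12:                        # (near-)invariant subspace
                break
            V = np.hstack([V, (w / nrm)[:, None]])
        # Rayleigh-Ritz: project A onto span(V) and diagonalize.
        H = V.T @ A @ V
        theta, S = np.linalg.eigh(H)               # Ritz values, ascending
        # Thick restart: keep the Ritz vectors of the k largest Ritz values.
        V = V @ S[:, -k:]
    return theta[-k:], V

# Usage example on a random symmetric matrix (illustrative only).
rng = np.random.default_rng(0)
B = rng.standard_normal((200, 200))
A = (B + B.T) / 2
vals, vecs = thick_restart_lanczos(A, rng.standard_normal(200), k=3, m=20, restarts=100)
print(vals)                                        # Ritz approximations
print(np.sort(np.linalg.eigvalsh(A))[-3:])         # exact largest eigenvalues
```

Because each new subspace contains the Ritz vectors retained at the previous restart, the Ritz values in this sketch cannot decrease from one cycle to the next (by Cauchy interlacing), which is the sort of structural property a global convergence analysis can build on.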
