Wednesday, April 11, 2001
Lecturer: Richard Martin
Reading:
Thijssen, A.8.2.2
Notes
The Lanczos Method
Method for "Exact Diagonalization" for a few
extreme eigenstates of extremely large matrices
Also the basis of the "recursion method", which leads
to many useful relations, including frequency-dependent
spectra and thermal expectation values
Recall other iterative methods for large matrices
Conjugate Gradient minimization
RMM-DIIS (residual minimization)
Folded spectrum: special case with operator (H - E)^2
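As a quick illustration of the folded-spectrum idea (a minimal NumPy sketch, not from the lecture; the test matrix and reference energy are ours): the lowest eigenstate of (H - E)^2 is the eigenstate of H whose eigenvalue lies closest to the chosen reference energy, so an interior state can be reached with a ground-state method.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((30, 30)))
H = Q @ np.diag(np.arange(30.0)) @ Q.T   # known eigenvalues 0, 1, ..., 29
E_ref = 10.4                             # reference energy inside the spectrum

# Fold the spectrum about E_ref: the interior state nearest E_ref
# becomes the *lowest* state of (H - E_ref)^2.
M = (H - E_ref * np.eye(30)) @ (H - E_ref * np.eye(30))
v = np.linalg.eigh(M)[1][:, 0]           # lowest state of the folded operator
e = v @ H @ v                            # eigenvalue of H nearest E_ref (here 10)
```

The price is that squaring the operator squares its condition number, which slows iterative convergence.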
The Lanczos Method - General properties
Generate a basis by repeated multiplication of H
onto a starting vector (a Krylov subspace)
The Lanczos algorithm yields an orthonormal basis
in which H is tridiagonal
Any problem can be converted into a
"pseudo-1-dimensional" chain problem
Most useful for finding the few lowest (highest)
eigenvectors of very large, sparse matrices
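As a concrete sketch (ours, in NumPy; the lecture gives no code), the basic iteration builds the orthonormal Krylov basis and the tridiagonal elements alpha_j (diagonal) and beta_j (off-diagonal) in a single three-term recurrence:

```python
import numpy as np

def lanczos(H, v0, m):
    """m steps of the Lanczos recurrence for a symmetric matrix H.

    Returns the diagonal (alpha) and off-diagonal (beta) of the
    tridiagonal representation of H, plus the orthonormal Krylov
    basis V (rows are the Lanczos vectors)."""
    n = len(v0)
    V = np.zeros((m, n))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[0] = v0 / np.linalg.norm(v0)
    w = H @ V[0]
    alpha[0] = V[0] @ w
    w -= alpha[0] * V[0]
    for j in range(1, m):
        beta[j - 1] = np.linalg.norm(w)   # zero here would signal an exact
        V[j] = w / beta[j - 1]            # invariant subspace (breakdown)
        w = H @ V[j]
        alpha[j] = V[j] @ w
        w -= alpha[j] * V[j] + beta[j - 1] * V[j - 1]
    return alpha, beta, V
```

Only the product of H with a vector is ever needed, which is why the method suits very large sparse matrices; diagonalizing the small tridiagonal matrix (e.g. with scipy.linalg.eigh_tridiagonal) gives Ritz values that converge quickly to the extreme eigenvalues of H.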
What is the "catch"? Why not always use Lanczos for
all eigenvectors?
Because rounding errors accumulate. Each step enforces
orthogonality only against the most recent vectors, so over many
iterations global orthogonality is gradually lost, and spurious
"ghost" copies of converged eigenvalues may appear.
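A small numerical experiment (ours, with a random symmetric matrix standing in for H) makes the loss visible: running the plain three-term recurrence for as many steps as the matrix dimension, the first few Lanczos vectors stay orthonormal to machine precision while the full set does not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
A = rng.standard_normal((n, n))
H = (A + A.T) / 2                  # random symmetric stand-in for H

m = n                              # run the recurrence for n full steps
V = np.zeros((m, n))
V[0] = rng.standard_normal(n)
V[0] /= np.linalg.norm(V[0])
w = H @ V[0]
w -= (V[0] @ w) * V[0]
for j in range(1, m):
    b = np.linalg.norm(w)          # off-diagonal element beta_j
    V[j] = w / b
    w = H @ V[j]
    w -= (V[j] @ w) * V[j] + b * V[j - 1]

G = V @ V.T                        # Gram matrix; ideally the identity
# The leading block stays orthonormal, but the full Gram matrix
# deviates strongly from the identity once eigenvalues converge.
```

The standard remedies are full or selective reorthogonalization of each new vector against the stored basis.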
Variations on the way the iteration is performed
Expansion of the Krylov space until the lowest
eigenstate(s) converge
Repeated diagonalization of small m x m matrices
until the lowest eigenstate(s) converge
Can be as small as 2 x 2
Very closely related to the RMM-DIIS method!
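The 2 x 2 variant can be sketched as follows (our illustration, assuming a real symmetric H): at every step, diagonalize H in the two-dimensional subspace spanned by the current vector v and H v, keep the lowest Ritz vector, and repeat until the residual vanishes.

```python
import numpy as np

def lowest_eig_2x2(H, v, iters=300):
    """Repeatedly diagonalize H in the 2-dimensional subspace
    spanned by the current vector v and H v (the smallest possible
    Krylov space), keeping the lowest Ritz vector each time."""
    v = v / np.linalg.norm(v)
    for _ in range(iters):
        w = H @ v
        w -= (v @ w) * v                # orthogonalize H v against v
        nw = np.linalg.norm(w)
        if nw < 1e-14:                  # v is (numerically) an eigenvector
            break
        w /= nw
        # 2 x 2 projected Hamiltonian in the {v, w} basis
        h = np.array([[v @ H @ v, v @ H @ w],
                      [w @ H @ v, w @ H @ w]])
        c = np.linalg.eigh(h)[1][:, 0]  # lowest eigenvector of the 2 x 2
        v = c[0] * v + c[1] * w
        v /= np.linalg.norm(v)
    return v @ H @ v, v
```

Because the trial vector is refreshed from a tiny subspace at every step, orthogonality can never drift the way it does in a long Lanczos chain; the cost is more matrix-vector products per converged state.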
Correlation functions in the ground state
Show intrinsic properties of the ground state
Also show properties of the excitations
Exponential decay of correlation functions
implies a gap in the excitation spectrum
Power-law decay implies no gap and special
properties of the excitations