A Case for a Biorthogonal Jacobi--Davidson Method: Restarting and Correction Equation

Andreas Stathopoulos
2002

Abstract

We propose a biorthogonal Jacobi--Davidson method (biJD), which can be viewed as an explicitly biorthogonalized, restarted Lanczos method, that uses the approximate solution of a correction equation to expand its basis. Through an elegant formulation, the algorithm allows for all the functionalities and features of the Jacobi--Davidson method (JD), but it also includes some of the advantages of nonsymmetric Lanczos. The main motivation for this work stems from a correction equation and a restarting scheme that are possible with biJD but not with JD. Specifically, a correction equation using the left approximate eigenvectors available in biJD yields cubic asymptotic convergence, as opposed to quadratic with the JD correction equation. In addition, a successful restarting scheme for symmetric JD depends on the Lanczos three-term recurrence and thus can only apply to biJD. Finally, methods that require a multiplication with the adjoint of the matrix need to be reconsidered on today's computers with memory hierarchies, as this multiplication can be performed with minimal additional cost. We describe the algorithm, its features, and the possible functionalities. In addition, we develop an appropriate correction equation framework and analyze the effects of the new restarting scheme. Our numerical experiments confirm that biJD is a highly competitive method for a difficult problem.
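To illustrate the kind of correction equation the abstract refers to, the following is a minimal sketch contrasting the standard JD correction equation with an obliquely projected variant that uses a left approximate eigenvector, as it commonly appears in the two-sided/biorthogonal JD literature. The notation (A, u, w, theta, r) is assumed here for illustration; the precise biJD correction equation and its cubic convergence analysis are developed in the paper itself.

Standard JD, for a right Ritz pair $(\theta, u)$ with residual $r = Au - \theta u$:
$$(I - uu^*)\,(A - \theta I)\,(I - uu^*)\, t = -r, \qquad t \perp u .$$

Two-sided variant, given a left approximate eigenvector $w$ with $w^* u \neq 0$ and Petrov value $\theta = \dfrac{w^* A u}{w^* u}$:
$$\Bigl(I - \tfrac{u w^*}{w^* u}\Bigr)(A - \theta I)\Bigl(I - \tfrac{u w^*}{w^* u}\Bigr)\, t = -r, \qquad t \perp w .$$

Solving the second equation exactly is what drives the faster asymptotic convergence claimed for methods of this type, since both right and left approximations are refined simultaneously.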


Code References

pytorch/pytorch
torch/_lobpcg.py, line 373: selection [StathopoulosEtal2002]. A robust method.