Results 1 - 7 of 7
Implicitly restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations
1996
Abstract

Cited by 34 (3 self)
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually nonexistent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
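The Implicitly Restarted Arnoldi Method highlighted here is the algorithm underlying ARPACK, which SciPy exposes as `scipy.sparse.linalg.eigs`. A minimal sketch of its use (the test matrix and parameter choices below are illustrative, not taken from the article):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigs

# A large sparse nonsymmetric test matrix: a 1-D convection-diffusion
# stencil (an illustrative choice, not an example from the article).
n = 200
A = diags([-1.2, 2.0, -0.8], [-1, 0, 1], shape=(n, n), format="csr")

# eigs wraps ARPACK's implicitly restarted Arnoldi iteration: it builds
# a Krylov subspace of dimension ncv, restarts it implicitly, and
# returns the k eigenvalues of largest magnitude.
vals, vecs = eigs(A, k=6, which="LM", ncv=40, tol=1e-10)

# Residual check ||A v - lambda v|| for the first computed Ritz pair.
r = np.linalg.norm(A @ vecs[:, 0] - vals[0] * vecs[:, 0])
print(r < 1e-8)
```

The `which` argument selects which part of the spectrum the restarts filter toward (largest magnitude, largest real part, and so on), which is what makes the method a practical basis for general-purpose software.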
Matrix transformations for computing rightmost eigenvalues of large sparse nonsymmetric eigenvalue problems
IMA J. Numer. Anal.
1996
Abstract

Cited by 33 (7 self)
This paper gives an overview of matrix transformations for finding rightmost eigenvalues of Ax = λx and Ax = λBx with A and B real nonsymmetric and B possibly singular. The aim is not to present new material, but to introduce the reader to the application of matrix transformations to the solution of large-scale eigenvalue problems. The paper explains and discusses the use of Chebyshev polynomials and the shift-invert and Cayley transforms as matrix transformations for problems that arise from the discretization of partial differential equations. A few other techniques are described. The reliability of iterative methods is also dealt with by introducing the concept of domain of confidence or trust region. This overview gives the reader an idea of the benefits and the drawbacks of several transformation techniques. We also briefly discuss the current software...
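The shift-invert transform discussed in this entry replaces A with (A - σI)⁻¹, mapping the eigenvalues nearest the shift σ to the largest-magnitude eigenvalues of the transformed operator, which Arnoldi/Lanczos iterations find first. A sketch of the idea (matrix and shift chosen for illustration; SciPy's `eigs` can also apply this transform internally via its `sigma` argument):

```python
import numpy as np
from scipy.sparse import diags, identity
from scipy.sparse.linalg import LinearOperator, eigs, splu

# Shift-invert: eigenvalues of A nearest a shift sigma become the
# largest-magnitude eigenvalues of (A - sigma I)^{-1}.  Recover the
# original eigenvalues via theta = sigma + 1/mu.
n = 500
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
sigma = 0.2  # target the eigenvalues of A nearest 0.2

lu = splu((A - sigma * identity(n, format="csc")).tocsc())
op = LinearOperator((n, n), matvec=lu.solve, dtype=float)

mu, _ = eigs(op, k=4, which="LM")        # transformed problem
theta = np.sort(sigma + 1.0 / mu.real)   # eigenvalues of A near sigma

# This tridiagonal has known eigenvalues 2 - 2 cos(k pi / (n+1)),
# so we can check against the four closest to sigma.
exact = 2.0 - 2.0 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
nearest = np.sort(exact[np.argsort(np.abs(exact - sigma))[:4]])
print(np.allclose(theta, nearest, atol=1e-8))
```

The price of the transform is one sparse factorization (here `splu`); the Cayley transform variants surveyed in the paper trade off similar costs against different spectral mappings.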
CRPC Research into Linear Algebra Software for High Performance Computers
1994
Abstract

Cited by 4 (2 self)
In this paper we look at a number of approaches being investigated in the Center for Research on Parallel Computation (CRPC) to develop linear algebra software for high-performance computers. These approaches are exemplified by the LAPACK, templates, and ARPACK projects. LAPACK is a software library for performing dense and banded linear algebra computations, and was designed to run efficiently on high-performance computers. We focus on the design of the distributed memory version of LAPACK, and on an object-oriented interface to LAPACK. The templates project aims at making the task of developing sparse linear algebra software simpler and easier. Reusable software templates are provided that the user can then customize to modify and optimize a particular algorithm, and hence build more complex applications. ARPACK is a software package for solving large-scale eigenvalue problems, and is based on an implicitly restarted variant of the Arnoldi scheme. The paper focuses on issues impact...
Jacobi-Davidson Methods for Generalized MHD-Eigenvalue Problems
1996
Abstract

Cited by 2 (0 self)
In this paper the emphasis is put on the case where one of the matrices, say the B-matrix, is Hermitian positive definite. The method is an inner-outer iterative scheme, in which the inner iteration process consists of solving linear systems to some accuracy. The factorization of either matrix is avoided. Numerical experiments are presented for problems arising in magnetohydrodynamics (MHD).
A preconditioned Jacobi-Davidson method for solving large generalized eigenvalue problems
1994
Abstract

Cited by 1 (0 self)
In this paper we apply the recently proposed Jacobi-Davidson method for calculating extreme eigenvalues of large matrices to a generalized eigenproblem. This leads to an algorithm that computes the extreme eigensolutions of a matrix pencil (A, B), where A and B are general matrices. Factorization of either of them is avoided. Instead we need to solve two linear systems with sufficient, but modest, accuracy. If both linear systems are solved accurately enough, an asymptotically quadratic speed of convergence can be achieved. Interior eigenvalues in the vicinity of a given complex number σ can be computed without factorization as well. We illustrate the procedure with a few numerical examples, one of them being an application in magnetohydrodynamics.
AMS Subject Classification (1991): 65F15
CR Subject Classification (1991): G.1.3
Keywords & Phrases: eigenvalues, eigenvectors, matrix pairs, Jacobi-Davidson method, GMRES, preconditioner, magnetohydrodynamics