Results 1-10 of 145
QMR: a Quasi-Minimal Residual Method for Non-Hermitian Linear Systems
, 1991
Cited by 334 (26 self)
Abstract:
... In this paper, we present a novel BCG-like approach, the quasi-minimal residual (QMR) method, which overcomes the problems of BCG. An implementation of QMR based on a look-ahead version of the nonsymmetric Lanczos algorithm is proposed. It is shown how BCG iterates can be recovered stably from the QMR process. Some further properties of the QMR approach are given and an error bound is presented. Finally, numerical experiments are reported.
Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods
, 1994
Cited by 170 (5 self)
Abstract:
This document is the electronic version of the 2nd edition of the Templates book, which is available for purchase from the Society for Industrial and Applied Mathematics (SIAM).
Iterative Solution of Linear Systems
 Acta Numerica
, 1992
Cited by 100 (8 self)
Abstract:
... this paper is as follows. In Section 2, we present some background material on general Krylov subspace methods, of which CG-type algorithms are a special case. We recall the outstanding properties of CG and discuss the issue of optimal extensions of CG to non-Hermitian matrices. We also review GMRES and related methods, as well as CG-like algorithms for the special case of Hermitian indefinite linear systems. Finally, we briefly discuss the basic idea of preconditioning. In Section 3, we turn to Lanczos-based iterative methods for general non-Hermitian linear systems. First, we consider the nonsymmetric Lanczos process, with particular emphasis on the possible breakdowns and potential instabilities in the classical algorithm. Then we describe recent advances in understanding these problems and overcoming them by using look-ahead techniques. Moreover, we describe the quasi-minimal residual algorithm (QMR) proposed by Freund and Nachtigal (1990), which uses the look-ahead Lanczos process to obtain quasi-optimal approximate solutions. Next, a survey of transpose-free Lanczos-based methods is given. We conclude this section with comments on other related work and some historical remarks. In Section 4, we elaborate on CGNR and CGNE and we point out situations where these approaches are optimal. The general class of Krylov subspace methods also contains parameter-dependent algorithms that, unlike CG-type schemes, require explicit information on the spectrum of the coefficient matrix. In Section 5, we discuss recent insights in obtaining appropriate spectral information for parameter-dependent Krylov subspace methods. After that, ...
A restarted GMRES method augmented with eigenvectors
 SIAM J. Matrix Anal. Appl
, 1995
Cited by 77 (9 self)
Abstract:
The GMRES method for solving nonsymmetric linear equations is generally used with restarting to reduce storage and orthogonalization costs. Restarting slows down the convergence. However, it is possible to save some important information at the time of the restart. It is proposed that approximate eigenvectors corresponding to a few of the smallest eigenvalues be formed and added to the subspace for GMRES. The convergence can be much faster, and the minimum residual property is retained.
Key words: GMRES, conjugate gradient, Krylov subspaces, iterative methods, nonsymmetric systems
AMS subject classifications: 65F15, 15A18
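The restart mechanism this abstract refers to can be sketched as plain GMRES(m): build an m-step Arnoldi basis, minimize the residual over it, then discard the basis and restart from the new iterate. The sketch below is a textbook baseline, not the paper's eigenvector-augmented variant; the names `gmres_restarted`, `m`, and `restarts` are ours for illustration.

```python
import numpy as np

def gmres_restarted(A, b, m=10, restarts=20, tol=1e-10):
    # Plain GMRES(m): Arnoldi builds an m-dimensional Krylov basis,
    # a small least-squares problem minimizes the residual over it,
    # and the process restarts from the improved iterate.
    n = len(b)
    x = np.zeros(n)
    for _ in range(restarts):
        r = b - A @ x
        beta = np.linalg.norm(r)
        if beta < tol:
            break
        V = np.zeros((n, m + 1))            # orthonormal Krylov basis
        H = np.zeros((m + 1, m))            # upper Hessenberg matrix
        V[:, 0] = r / beta
        k = m
        for j in range(m):
            w = A @ V[:, j]
            for i in range(j + 1):          # modified Gram-Schmidt
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-14:         # lucky breakdown: exact solve
                k = j + 1
                break
            V[:, j + 1] = w / H[j + 1, j]
        # minimize ||beta*e1 - H_k y|| for the optimal correction
        e1 = np.zeros(k + 1)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
        x = x + V[:, :k] @ y
    return x
```

Everything computed during a cycle is thrown away at the restart, which is exactly the information loss the paper proposes to mitigate by carrying approximate eigenvectors into the next subspace.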
Efficient spectral-Galerkin methods III. Polar and cylindrical geometries
 SIAM J. Sci. Comput
, 1995
Cited by 72 (33 self)
Abstract:
Efficient direct solvers based on the Chebyshev-Galerkin methods for second- and fourth-order equations are presented. They are based on appropriate basis functions for the Galerkin formulation that lead to discrete systems with specially structured matrices which can be efficiently inverted. Numerical results indicate that the direct solvers presented in this paper are significantly more accurate and efficient than those based on the Chebyshev-tau method.
GMRESR: A family of nested GMRES methods
 Num. Lin. Alg. with Appl
, 1991
Cited by 58 (16 self)
Abstract:
Recently Eirola and Nevanlinna have proposed an iterative solution method for unsymmetric linear systems, in which the preconditioner is updated from step to step. Following their ideas we suggest variants of GMRES, in which a preconditioner is constructed at each iteration step by a suitable approximation process, e.g., by GMRES itself.
Keywords: GMRES, nonsymmetric linear systems, iterative solver, EN-method
This version is dated June 23, 1992.
Introduction: The GMRES method, proposed in [13], is a popular method for the iterative solution of sparse linear systems with an unsymmetric nonsingular matrix. In its original form, so-called full GMRES, it is optimal in the sense that it minimizes the residual over the current Krylov subspace. However, it is often too expensive, since the required orthogonalization per iteration step grows quadratically with the number of steps. For that reason, one often uses in practice variants of GMRES. The most well-known variant, already suggested i...
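The nested idea, letting the "preconditioning" step itself be an approximate iterative solve, can be sketched as a GCR-style outer loop. This is a simplified illustration under our own naming (`gmresr`, `inner_solve`), not the authors' exact GMRESR implementation: the outer loop orthogonalizes each correction against previous search directions, while `inner_solve` may be any approximate solver, including a few steps of GMRES itself.

```python
import numpy as np

def gmresr(A, b, inner_solve, outer_iters=20, tol=1e-10):
    # GCR-style outer iteration: inner_solve(A, r) returns an
    # approximate solution u of A u = r, which is turned into a new
    # search direction orthogonal (in the A-image sense) to the old ones.
    n = len(b)
    x = np.zeros(n)
    r = b.copy()
    U, C = [], []                       # directions u_k and images c_k = A u_k
    for _ in range(outer_iters):
        if np.linalg.norm(r) < tol:
            break
        u = inner_solve(A, r)           # e.g. a few GMRES steps on A u = r
        c = A @ u
        for ci, ui in zip(C, U):        # orthogonalize c against prior c_i
            alpha = ci @ c
            c -= alpha * ci
            u -= alpha * ui
        nc = np.linalg.norm(c)
        c, u = c / nc, u / nc
        C.append(c)
        U.append(u)
        gamma = c @ r                   # optimal step along this direction
        x += gamma * u
        r -= gamma * c
    return x
```

With a one-shot Jacobi sweep as the (deliberately crude) inner solver, `gmresr(A, b, lambda A, r: r / np.diag(A))` already converges on diagonally dominant systems; replacing the inner solver with GMRES gives the nested scheme of the abstract.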
Recent computational developments in Krylov subspace methods for linear systems
 NUMER. LINEAR ALGEBRA APPL
, 2007
Cited by 48 (12 self)
Abstract:
Many advances in the development of Krylov subspace methods for the iterative solution of linear systems during the last decade and a half are reviewed. These new developments include different versions of restarted, augmented, deflated, flexible, nested, and inexact methods. Also reviewed are methods specifically tailored to systems with special properties such as special forms of symmetry and those depending on one or more parameters.
Solution of Shifted Linear Systems by Quasi-Minimal Residual Iterations
 in Numerical Linear Algebra
, 1993
Cited by 41 (4 self)
Abstract:
High-order implicit methods for solving time-dependent partial differential equations and frequency response computations in control theory give rise to shifted systems of linear equations. Such systems have identical right-hand sides, and their coefficient matrices differ from each other only by scalar multiples of the identity matrix. This paper explores the use of two quasi-minimal residual iterations, the QMR and the TFQMR algorithm, for the solution of such shifted linear systems. It is shown that both algorithms can exploit the special structure, and that, for any family of shifted linear systems, the number of matrix-vector products and the number of inner products is the same as for a single linear system. Convergence results for the QMR and TFQMR algorithms are presented. This research was performed at the Research Institute for Advanced Computer Science (RIACS), NASA Ames Research Center, Moffett Field, California 94035, and it was supported by Cooperative Agreement NCC 238...
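The structural fact such methods exploit is that Krylov subspaces are invariant under diagonal shifts, K_k(A + σI, b) = K_k(A, b), so one set of matrix-vector products serves every shift. A minimal numerical check of this invariance (the helper name `arnoldi_basis` is ours, not from the paper):

```python
import numpy as np

def arnoldi_basis(A, b, k):
    # Orthonormal basis of K_k(A, b) = span{b, Ab, ..., A^(k-1) b},
    # built by the Arnoldi process with modified Gram-Schmidt.
    n = len(b)
    V = np.zeros((n, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k - 1):
        w = A @ V[:, j]
        for i in range(j + 1):
            w -= (V[:, i] @ w) * V[:, i]
        V[:, j + 1] = w / np.linalg.norm(w)
    return V
```

Projecting the basis built for A + σI onto the span of the basis built for A reproduces it exactly, confirming that the two subspaces coincide; only the small projected problems differ per shift.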
Globalized Newton-Krylov-Schwarz algorithms and software for parallel implicit CFD
 Int. J. High Performance Computing Applications
, 1998
Cited by 36 (14 self)
Abstract:
Key words: Newton-Krylov-Schwarz algorithms, parallel CFD, implicit methods
Implicit solution methods are important in applications modeled by PDEs with disparate temporal and spatial scales. Because such applications require high resolution with reasonable turnaround, parallelization is essential. The pseudo-transient matrix-free Newton-Krylov-Schwarz (ΨNKS) algorithmic framework is presented as a widely applicable answer. This article shows that, for the classical problem of three-dimensional transonic Euler flow about an M6 wing, ΨNKS can simultaneously deliver:
• globalized, asymptotically rapid convergence through adaptive pseudo-transient continuation and Newton’s method;
• reasonable parallelizability for an implicit method through deferred synchronization and favorable communication-to-computation scaling in the Krylov linear solver; and
• high per-processor performance through attention to distributed memory and cache locality, especially through the Schwarz preconditioner.
Two discouraging features of ΨNKS methods are their sensitivity to the coding of the underlying PDE discretization and the large number of parameters that must be selected to govern convergence. We therefore distill several recommendations from our experience and from our reading of the literature on various algorithmic components of ΨNKS, and we describe a freely available, MPI-based portable parallel software implementation of the solver employed here.
From Potential Theory to Matrix Iterations in Six Steps
 SIAM REVIEW
Cited by 35 (4 self)
Abstract:
The theory of the convergence of Krylov subspace iterations for linear systems of equations (conjugate gradients, biconjugate gradients, GMRES, QMR, BiCGSTAB, ...) is reviewed. For a computation of this kind, an estimated asymptotic convergence factor ρ ≤ 1 can be derived by solving a problem of potential theory or conformal mapping. Six approximations are involved in relating the actual computation to this scalar estimate. These six approximations are discussed in a systematic way and illustrated by a sequence of examples computed with tools of numerical conformal mapping and semidefinite programming.