Results 1 - 5 of 5
Preconditioning techniques for large linear systems: A survey
J. Comput. Phys., 2002
Abstract

Cited by 101 (4 self)
This article surveys preconditioning techniques for the iterative solution of large linear systems, with a focus on algebraic methods suitable for general sparse matrices. Covered topics include progress in incomplete factorization methods, sparse approximate inverses, reorderings, parallelization issues, and block and multilevel extensions. Some of the challenges ahead are also discussed. An extensive bibliography completes the paper.
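As an illustration of the kind of algebraic preconditioning the survey classifies, here is a minimal pure-Python sketch of the simplest case, diagonal (Jacobi) scaling used inside a Richardson iteration. This is illustrative only and not from the survey; the matrix, names, and iteration count are made up, and the survey's focus (incomplete factorizations, sparse approximate inverses) is far more sophisticated.

```python
# Jacobi-preconditioned Richardson iteration: x <- x + M^{-1}(b - A x)
# with M = diag(A). Toy dense matrix; real codes use sparse storage.

def matvec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def jacobi_preconditioned_richardson(A, b, iters=50):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        r = [bi - ri for bi, ri in zip(b, matvec(A, x))]   # residual b - A x
        x = [xi + ri / A[i][i]                              # apply M^{-1} = D^{-1}
             for i, (xi, ri) in enumerate(zip(x, r))]
    return x

# Small diagonally dominant SPD test system (exact solution is [1, 1, 1]).
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = jacobi_preconditioned_richardson(A, b)
residual = max(abs(bi - ri) for bi, ri in zip(b, matvec(A, x)))
```

Diagonal dominance guarantees convergence here; the survey's methods aim at the general sparse matrices where such simple scalings fail.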
Differences in the effects of rounding errors in Krylov solvers for symmetric indefinite linear systems
, 1999
Abstract

Cited by 15 (0 self)
The 3-term Lanczos process leads, for a symmetric matrix, to bases for Krylov subspaces of increasing dimension. The Lanczos basis, together with the recurrence coefficients, can be used for the solution of symmetric indefinite linear systems by solving the reduced system in one way or another. This leads to well-known methods: MINRES, GMRES, and SYMMLQ. We will discuss in what way and to what extent these approaches differ in their sensitivity to rounding errors. In our analysis we will assume that the Lanczos basis is generated in exactly the same way for the different methods, and we will not consider the errors in the Lanczos process itself. We will show that the method of solution may lead, under certain circumstances, to large additional errors that are not corrected by continuing the iteration process. Our findings are supported and illustrated by numerical examples.
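The 3-term recurrence the abstract refers to can be sketched as follows. This is a toy illustration with an assumed symmetric indefinite test matrix, not the paper's code; in exact arithmetic the generated basis is orthonormal, and the solvers the paper compares (MINRES, GMRES, SYMMLQ) differ only in how they use this basis and its recurrence coefficients.

```python
import math

def lanczos(A, v0, steps):
    """3-term Lanczos: beta_{j+1} v_{j+1} = A v_j - alpha_j v_j - beta_j v_{j-1}."""
    n = len(v0)
    nrm = math.sqrt(sum(vi * vi for vi in v0))
    v = [vi / nrm for vi in v0]
    v_prev, beta = [0.0] * n, 0.0
    basis, alphas, betas = [v], [], []
    for _ in range(steps):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]  # A v_j
        alpha = sum(wi * vi for wi, vi in zip(w, v))
        w = [wi - alpha * vi - beta * pi for wi, vi, pi in zip(w, v, v_prev)]
        beta = math.sqrt(sum(wi * wi for wi in w))
        alphas.append(alpha)
        if beta < 1e-14:          # invariant subspace reached: stop
            break
        betas.append(beta)
        v_prev, v = v, [wi / beta for wi in w]
        basis.append(v)
    return basis, alphas, betas

# Symmetric indefinite toy matrix (mixed-sign diagonal).
A = [[2.0, 1.0, 0.0, 0.0],
     [1.0, -2.0, 1.0, 0.0],
     [0.0, 1.0, 2.0, 1.0],
     [0.0, 0.0, 1.0, -2.0]]
basis, alphas, betas = lanczos(A, [1.0, 1.0, 0.0, 0.0], 3)
ortho_err = max(abs(sum(x * y for x, y in zip(basis[i], basis[j])))
                for i in range(len(basis)) for j in range(i))
```

The paper's subject is precisely what this sketch hides: in floating point the basis slowly loses orthogonality, and the three solution approaches amplify those errors differently.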
Generalizations and Modifications of the GMRES Iterative Method
, 1997
Abstract

Cited by 1 (0 self)
We are interested in iterative methods for solving systems of linear equations of the form Au = b, where A is a large sparse nonsingular matrix. When A is symmetric positive definite, conjugate-gradient-type methods are often used since they are fairly well understood. On the other hand, when A is nonsymmetric, the choice of an effective iterative method is much more difficult. For solving nonsymmetric linear systems, the well-known GMRES method is considered to be a stable method; however, the work per iteration increases as the number of iterations increases. We consider two new iterative methods, GGMRES and MGMRES, which are a generalization and a modification of the GMRES method, respectively. Instead of using a minimization condition as in GGMRES, we use a Galerkin condition to derive the MGMRES method. We also introduce another new iterative method, LAN/MGMRES, which is designed to combine the reliability of GMRES with the reduced work of a Lanczos-type method.
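For context, the minimization condition that the abstract's variants relax can be written down compactly. Below is a hedged pure-Python sketch of textbook GMRES (Arnoldi process plus a small Hessenberg least-squares solve via normal equations), not the paper's GGMRES/MGMRES code; the test matrix and sizes are made up.

```python
import math

def solve_dense(M, rhs):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(rhs)
    M = [row[:] + [rhs[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def gmres(A, b, steps):
    """GMRES with x0 = 0: Arnoldi basis, then minimize ||beta*e1 - H y||."""
    n = len(b)
    beta = math.sqrt(sum(bi * bi for bi in b))
    V = [[bi / beta for bi in b]]                   # orthonormal Krylov basis
    H = [[0.0] * steps for _ in range(steps + 1)]   # upper Hessenberg matrix
    for j in range(steps):
        w = [sum(A[i][k] * V[j][k] for k in range(n)) for i in range(n)]
        for i in range(j + 1):                      # modified Gram-Schmidt
            H[i][j] = sum(wi * vi for wi, vi in zip(w, V[i]))
            w = [wi - H[i][j] * vi for wi, vi in zip(w, V[i])]
        H[j + 1][j] = math.sqrt(sum(wi * wi for wi in w))
        if H[j + 1][j] > 1e-14:
            V.append([wi / H[j + 1][j] for wi in w])
    m = steps
    # Least squares via normal equations: (H^T H) y = H^T (beta e1).
    HtH = [[sum(H[k][i] * H[k][j] for k in range(m + 1)) for j in range(m)]
           for i in range(m)]
    Htg = [H[0][i] * beta for i in range(m)]
    y = solve_dense(HtH, Htg)
    return [sum(y[j] * V[j][i] for j in range(m)) for i in range(n)]

A = [[3.0, 1.0, 0.0],
     [0.0, 3.0, 1.0],
     [0.0, 0.0, 3.0]]
b = [1.0, 1.0, 1.0]
x = gmres(A, b, 3)
res = max(abs(sum(A[i][k] * x[k] for k in range(3)) - b[i]) for i in range(3))
```

Replacing the least-squares solve with a Galerkin condition on the same Krylov basis is, roughly, the move the abstract describes for MGMRES; production GMRES also uses Givens rotations rather than normal equations.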
Spectral Properties by Using Splitting Correction Preconditioner for Linear Systems that Arise from Periodic Boundary Problems
, 2000
Abstract
In this paper, the spectral properties of systems preconditioned by the "Splitting Correction (SC)" preconditioner, proposed by the present authors, are studied, and it is conjectured that the degeneracy, not the clustering, of the eigenvalues plays an important role in the convergence. The SC preconditioner is a new preconditioner based on block factorization for solving linear systems that arise from periodic boundary problems. From the viewpoint of the convergence of the residual norm, the conjugate gradient (CG) method using the SC is faster than using a conventional preconditioner, block incomplete Cholesky (block IC) factorization. Furthermore, the behavior of the residual norm of the CG method preconditioned by the SC and by the block IC is very peculiar. Generally, the convergence of the CG method depends on spectral properties of the coefficient matrix, such as the clustering and the degeneracy of its eigenvalues. For symmetric linear systems that arise from periodic boundary problems, the eigenvalue distribution and the condition number of the coefficient matrix are evaluated. These numerical results suggest that the fast convergence of the SC is due not to the clustering but to the degeneracy of the eigenvalues of the preconditioned coefficient matrix. Keywords: block preconditioning, the Sherman-Morrison formula, rank correction, spectral property, conjugate gradient method.
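The abstract's central observation, that the number of distinct eigenvalues (degeneracy) rather than mere clustering governs CG's iteration count, is easy to illustrate: in exact arithmetic, CG on a matrix with k distinct eigenvalues converges in at most k steps. A toy sketch follows, using a made-up diagonal test matrix rather than the paper's periodic-boundary systems.

```python
import math

def cg_diag(d, b, iters):
    """Plain CG for the diagonal SPD system diag(d) x = b; returns (x, ||r||)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = [di * pi for di, pi in zip(d, p)]
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x, math.sqrt(rs)

# Six unknowns but only two distinct (highly degenerate) eigenvalues:
d = [1.0, 1.0, 1.0, 10.0, 10.0, 10.0]
b = [1.0] * 6
x, rnorm = cg_diag(d, b, 2)   # 2 distinct eigenvalues -> done in 2 steps
```

A matrix with six merely clustered (but distinct) eigenvalues would not terminate this sharply, which is the distinction the paper draws.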
Static Condensation, Partial Orthogonalization of Basis Functions, and ILU Preconditioning in hp-FEM
Abstract
Static condensation of internal degrees of freedom, partial orthogonalization of basis functions, and ILU preconditioning are techniques used to facilitate the solution of discrete problems obtained in the hp-FEM. This paper shows that for a wide class of symmetric (not necessarily positive-definite) linear problems, these three techniques are completely equivalent. In fact, the same matrices can be obtained by the same arithmetic operations. The study extends naturally to nonsymmetric problems.
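The elimination of internal degrees of freedom behind static condensation is a Schur complement. A minimal numeric sketch, using a made-up 3x3 symmetric system with one "internal" unknown rather than an actual hp-FEM matrix:

```python
# Condense the internal unknown u0 out of A u = f, leaving the Schur
# complement S = A_bb - A_bi A_ii^{-1} A_ib on the "boundary" unknowns.
A = [[4.0, 1.0, 2.0],
     [1.0, 3.0, 1.0],
     [2.0, 1.0, 5.0]]
f = [1.0, 2.0, 3.0]

aii = A[0][0]                      # internal block (a scalar here)
Aib = [A[0][1], A[0][2]]           # internal-to-boundary coupling
Abi = [A[1][0], A[2][0]]
Abb = [[A[1][1], A[1][2]], [A[2][1], A[2][2]]]

# Condensed system S u_b = g.
S = [[Abb[i][j] - Abi[i] * Aib[j] / aii for j in range(2)] for i in range(2)]
g = [f[1 + i] - Abi[i] * f[0] / aii for i in range(2)]

# Solve the 2x2 condensed system by Cramer's rule.
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
u_b = [(g[0] * S[1][1] - g[1] * S[0][1]) / det,
       (S[0][0] * g[1] - S[1][0] * g[0]) / det]

# Recover the internal unknown by back-substitution.
u0 = (f[0] - Aib[0] * u_b[0] - Aib[1] * u_b[1]) / aii
u = [u0, u_b[0], u_b[1]]
residual = max(abs(sum(A[i][j] * u[j] for j in range(3)) - f[i])
               for i in range(3))
```

The paper's point is that, for the symmetric problems it considers, this elimination produces the same matrices as partial orthogonalization of the basis and as an ILU factorization with a matching ordering.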