Results 1–10 of 13
Preconditioning techniques for large linear systems: A survey
 J. Comput. Phys.
, 2002
"... This article surveys preconditioning techniques for the iterative solution of large linear systems, with a focus on algebraic methods suitable for general sparse matrices. Covered topics include progress in incomplete factorization methods, sparse approximate inverses, reorderings, parallelization i ..."
Abstract

Cited by 189 (5 self)
 Add to MetaCart
(Show Context)
This article surveys preconditioning techniques for the iterative solution of large linear systems, with a focus on algebraic methods suitable for general sparse matrices. Covered topics include progress in incomplete factorization methods, sparse approximate inverses, reorderings, parallelization issues, and block and multilevel extensions. Some of the challenges ahead are also discussed. An extensive bibliography completes the paper.
Orderings for incomplete factorization preconditioning of nonsymmetric problems
 SIAM J. Sci. Comput.
, 1999
"... Numerical experiments are presented whereby the effect of reorderings on the convergence of preconditioned Krylov subspace methods for the solution of nonsymmetric linear systems is shown. The preconditioners used in this study are different variants of incomplete factorizations. It is shown that c ..."
Abstract

Cited by 60 (11 self)
 Add to MetaCart
Numerical experiments are presented that show the effect of reorderings on the convergence of preconditioned Krylov subspace methods for the solution of nonsymmetric linear systems. The preconditioners used in this study are different variants of incomplete factorizations. It is shown that certain reorderings developed for direct methods, such as reverse Cuthill–McKee, can be very beneficial. The benefit can be seen both in the reduction of the number of iterations and in the deviation of the preconditioned operator from the identity.
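The reordering effect this abstract describes can be sketched with SciPy's reverse Cuthill–McKee implementation. The convection–diffusion-like stencil, the drop tolerance, and the fill factor below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import spilu

# Hypothetical test problem: a nonsymmetric 5-point stencil on a 20x20
# grid, randomly permuted so the natural band structure is lost.
m = 20
A = sp.diags([-1.0, -1.0, 4.2, -1.2, -1.0], [-m, -1, 0, 1, m],
             shape=(m * m, m * m), format="csr")
rng = np.random.default_rng(0)
p = rng.permutation(m * m)
A = A[p][:, p]

def bandwidth(M):
    C = M.tocoo()
    return int(np.abs(C.row - C.col).max())

# Reverse Cuthill-McKee, applied symmetrically (the pattern is symmetric).
perm = reverse_cuthill_mckee(A, symmetric_mode=True)
A_rcm = A[perm][:, perm]

# Incomplete factors under each ordering; permc_spec="NATURAL" makes
# SuperLU honor the ordering we pass in instead of choosing its own.
ilu_orig = spilu(A.tocsc(), drop_tol=1e-4, fill_factor=30,
                 permc_spec="NATURAL")
ilu_rcm = spilu(A_rcm.tocsc(), drop_tol=1e-4, fill_factor=30,
                permc_spec="NATURAL")

print("bandwidth:", bandwidth(A), "->", bandwidth(A_rcm))
print("ILU nonzeros:", ilu_orig.nnz, "->", ilu_rcm.nnz)
```

With the band structure restored, the incomplete factors typically stay much sparser for the same drop tolerance, which is one face of the iteration-count benefit the study reports.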
Sparse Approximate Inverse Smoother for Multigrid
 SIAM J. Matrix Anal. Appl.
, 1999
"... Various forms of sparse approximate inverses (SAI) have been shown to be useful for preconditioning. Their potential usefulness in a parallel environment has motivated much interest in recent years. However, the capability of an approximate inverse in eliminating the local error has not yet been ful ..."
Abstract

Cited by 31 (2 self)
 Add to MetaCart
Various forms of sparse approximate inverses (SAI) have been shown to be useful for preconditioning. Their potential usefulness in a parallel environment has motivated much interest in recent years. However, the capability of an approximate inverse to eliminate the local error has not yet been fully exploited in multigrid algorithms. A careful examination of the iteration matrices of these approximate inverses indicates their superiority in smoothing the high-frequency error, in addition to their inherent parallelism. We propose a new class of sparse approximate inverse smoothers in this paper and present their analytic smoothing factors for constant-coefficient PDEs. A distinctive feature of this technique is that the smoothing factor can be improved by adjusting the quality of the approximate inverse, which is useful for hard problems.
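The least-squares construction underlying such smoothers can be sketched in a few lines. This is a generic Frobenius-norm SPAI (pattern of A, dense column solves), assuming SciPy; the 1-D Poisson matrix is an illustrative stand-in for a constant-coefficient PDE, not the paper's method:

```python
import numpy as np
import scipy.sparse as sp

def spai(A):
    """Least-squares sparse approximate inverse: minimize ||I - A M||_F
    column by column, constraining each column of M to the sparsity
    pattern of the corresponding column of A."""
    A = sp.csc_matrix(A)
    n = A.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        J = A.indices[A.indptr[j]:A.indptr[j + 1]]   # allowed rows of m_j
        Asub = A[:, J].toarray()
        rows = np.nonzero(Asub.any(axis=1))[0]       # rows actually touched
        rhs = (rows == j).astype(float)              # restriction of e_j
        mj, *_ = np.linalg.lstsq(Asub[rows, :], rhs, rcond=None)
        M[J, j] = mj
    return sp.csc_matrix(M)

# 1-D Poisson matrix as a stand-in for a constant-coefficient PDE.
n = 50
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
M = spai(A)

# Frobenius residual versus plain Jacobi (diagonal) scaling.
I = np.eye(n)
res_spai = np.linalg.norm(I - (A @ M).toarray())
res_jacobi = np.linalg.norm(I - (A @ sp.diags(1.0 / A.diagonal())).toarray())
print(f"||I - AM||_F: SPAI {res_spai:.3f}, Jacobi {res_jacobi:.3f}")
```

In a smoother role, M is applied as x ← x + M(b − Ax); because the minimization runs over a pattern that contains the diagonal, the Frobenius residual can never be worse than Jacobi's.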
Toward An Effective Sparse Approximate Inverse Preconditioner
 SIAM J. Matrix Anal. Appl.
, 1999
"... . Sparse approximate inverse preconditioners have attracted much attention recently, because of their potential usefulness in a parallel environment. In this paper, we explore several performance issues related to e#ective sparse approximate inverse preconditioners (SAIPs) for the matrices derived f ..."
Abstract

Cited by 28 (3 self)
 Add to MetaCart
(Show Context)
Sparse approximate inverse preconditioners have attracted much attention recently because of their potential usefulness in a parallel environment. In this paper, we explore several performance issues related to effective sparse approximate inverse preconditioners (SAIPs) for matrices derived from PDEs. Our refinements can significantly improve the quality of existing SAIPs and/or reduce the cost of computing them. For test problems from the Harwell–Boeing collection and some other applications, the performance of our preconditioners can be comparable or superior to incomplete LU (ILU) preconditioners with similar preconditioning cost.
A multilevel dual reordering strategy for robust incomplete LU factorization of indefinite matrices
 SIAM J. Matrix Anal. Appl.
, 2001
"... Abstract. A dual reordering strategy based on both threshold and graph reorderings is introduced to construct robust incomplete LU (ILU) factorization of indefinite matrices. The ILU matrix is constructed as a preconditioner for the original matrix to be used in a preconditioned iterative scheme. Th ..."
Abstract

Cited by 15 (5 self)
 Add to MetaCart
A dual reordering strategy based on both threshold and graph reorderings is introduced to construct robust incomplete LU (ILU) factorizations of indefinite matrices. The ILU matrix is constructed as a preconditioner for the original matrix, to be used in a preconditioned iterative scheme. The matrix is first divided into two parts according to a threshold parameter that controls diagonal dominance. The first part, with large diagonal dominance, is reordered using a graph-based strategy, followed by an ILU factorization. A partial ILU factorization is applied to the second part to yield an approximate Schur complement matrix. The whole process is repeated recursively on the Schur complement matrix for a few levels to yield a multilevel ILU factorization. Analyses are conducted to show how the Schur complement approach removes small diagonal elements of indefinite matrices and how the stability of the LU factor affects the quality of the preconditioner. Numerical results compare the new preconditioning strategy with two popular ILU preconditioning techniques and a multilevel block ILU threshold preconditioner.
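The split-and-Schur-complement idea can be illustrated with a dense two-level sketch, assuming NumPy. The test matrix, the dominance threshold of 1.0, and the exact (undropped) Schur complement are all illustrative simplifications of the multilevel ILU described above, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
# Hypothetical indefinite test matrix: a few rows have large diagonals,
# the rest have tiny ones that would make poor ILU pivots.
A = np.diag(np.r_[np.full(5, 4.0), np.full(3, 0.05)])
A += 0.5 * rng.standard_normal((n, n))

# Threshold step: diagonal-dominance ratio per row, dominant rows first.
d = np.abs(np.diag(A))
ratio = d / (np.abs(A).sum(axis=1) - d)
order = np.argsort(-ratio)
A = A[np.ix_(order, order)]
k = int(np.clip((ratio > 1.0).sum(), 1, n - 1))   # size of the "good" block

A11, A12 = A[:k, :k], A[:k, k:]
A21, A22 = A[k:, :k], A[k:, k:]
S = A22 - A21 @ np.linalg.solve(A11, A12)         # Schur complement

# Two-level solve: eliminate the good block, solve on S, back-substitute.
b = np.ones(n)
y = np.linalg.solve(S, b[k:] - A21 @ np.linalg.solve(A11, b[:k]))
x = np.r_[np.linalg.solve(A11, b[:k] - A12 @ y), y]

print("smallest |diagonal| in trailing block before/after elimination:",
      np.abs(np.diag(A22)).min().round(3), np.abs(np.diag(S)).min().round(3))
```

In the actual preconditioner the blocks are factored incompletely and entries are dropped; here the elimination is exact, so the two-level solve reproduces the direct solution, and the print compares the smallest diagonal magnitude of the trailing block before and after elimination.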
Anisotropy and Factored Sparse Approximate Inverses
 SIAM J. Sci. Comput.
, 1999
"... Abstract. We consider ordering techniques to improve the performance of factored sparse approximate inverse preconditioners, concentrating on the AINV technique of M. Benzi and M. T˚uma. Several practical existing unweighted orderings are considered along with a new algorithm, Minimum Inverse Penalt ..."
Abstract

Cited by 14 (5 self)
 Add to MetaCart
We consider ordering techniques to improve the performance of factored sparse approximate inverse preconditioners, concentrating on the AINV technique of M. Benzi and M. Tůma. Several practical existing unweighted orderings are considered, along with Minimum Inverse Penalty (MIP), a new algorithm that we propose. We show how good orderings such as these can improve the speed of preconditioner computation dramatically, and also demonstrate a fast and fairly reliable way of testing how good an ordering is in this respect. Our test results also show that these orderings generally improve convergence of Krylov subspace solvers, but may have difficulties with anisotropic problems in particular. We then argue that weighted orderings, which take into account the numerical values in the matrix, will be necessary for such systems. After developing a simple heuristic for dealing with anisotropy, we propose several practical algorithms to implement it. While these show promise, we conclude that a better heuristic is required for robustness.
Combinatorial problems in solving linear systems
, 2009
"... Numerical linear algebra and combinatorial optimization are vast subjects; as is their interaction. In virtually all cases there should be a notion of sparsity for a combinatorial problem to arise. Sparse matrices therefore form the basis of the interaction of these two seemingly disparate subjects. ..."
Abstract

Cited by 10 (4 self)
 Add to MetaCart
(Show Context)
Numerical linear algebra and combinatorial optimization are vast subjects, as is their interaction. In virtually all cases there should be a notion of sparsity for a combinatorial problem to arise. Sparse matrices therefore form the basis of the interaction of these two seemingly disparate subjects. As the core of many of today’s numerical linear algebra computations consists of the solution of sparse linear systems by direct or iterative methods, we survey some combinatorial problems, ideas, and algorithms relating to these computations. On the direct methods side, we discuss issues such as matrix ordering; bipartite matching and matrix scaling for better pivoting; and task assignment and scheduling for parallel multifrontal solvers. On the iterative methods side, we discuss preconditioning techniques, including incomplete factorization preconditioners, support graph preconditioners, and algebraic multigrid. In a separate part, we discuss the block triangular form of sparse matrices.
A Multilevel Algorithm for Reducing the Envelope of Sparse Matrices
, 1996
"... . Envelope methods for solving sparse systems of linear equations require the matrix to be reordered so that the nonzeros are near the diagonal. Optimal reorderings are known to be NPcomplete, but a variety of heuristics have been proposed. In this paper we describe a multilevel approach for findi ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
Envelope methods for solving sparse systems of linear equations require the matrix to be reordered so that the nonzeros are near the diagonal. Finding optimal reorderings is known to be NP-complete, but a variety of heuristics have been proposed. In this paper we describe a multilevel approach for finding small envelope orderings and for related ordering problems. We show that our approach generally produces better answers than traditional methods based on a breadth-first traversal of the graph of the matrix, while our runtimes are significantly faster than methods which require the computation of the Fiedler vector of the graph (spectral methods). The main competitor to our algorithm would be the recent version of the Sloan algorithm due to Kumfert and Pothen [15], which runs faster and often produces better-quality orderings. Our algorithm allows the user to easily adjust the time-quality tradeoff by tuning a single parameter. Many sparse matrices arise from the discretization of structur...
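The quantity being minimized here is easy to compute directly. A small SciPy sketch, using a randomly permuted 5-point grid Laplacian as an illustrative test matrix and SciPy's reverse Cuthill–McKee as a stand-in for the traditional breadth-first-style baselines mentioned above:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

def envelope_size(A):
    """Sum over rows of the distance from the first nonzero to the
    diagonal -- the quantity an envelope ordering tries to shrink."""
    A = sp.csr_matrix(A)
    total = 0
    for i in range(A.shape[0]):
        cols = A.indices[A.indptr[i]:A.indptr[i + 1]]
        if cols.size:
            total += max(0, i - int(cols.min()))
    return total

# 5-point grid Laplacian, randomly permuted to hide its structure.
m = 15
G = sp.diags([-1, -1, 4, -1, -1], [-m, -1, 0, 1, m],
             shape=(m * m, m * m), format="csr")
rng = np.random.default_rng(2)
p = rng.permutation(m * m)
G = G[p][:, p]

perm = reverse_cuthill_mckee(G, symmetric_mode=True)
G_rcm = G[perm][:, perm]
print("envelope:", envelope_size(G), "->", envelope_size(G_rcm))
```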
Approximate and Incomplete Factorizations
 ICASE/LaRC Interdisciplinary Series in Science and Engineering
, 1994
"... In this chapter, we give a brief overview of a particular class of preconditioners known as incomplete factorizations. They can be thought of as approximating the exact LU factorization of a given matrix A (e.g. computed via Gaussian elimination) by disallowing certain fillins. As opposed to other ..."
Abstract

Cited by 5 (2 self)
 Add to MetaCart
In this chapter, we give a brief overview of a particular class of preconditioners known as incomplete factorizations. They can be thought of as approximating the exact LU factorization of a given matrix A (e.g., computed via Gaussian elimination) by disallowing certain fill-ins. As opposed to other PDE-based preconditioners such as multigrid and domain decomposition, this class of preconditioners is primarily algebraic in nature and can in principle be applied to any sparse matrix. When applied to PDE problems, they are usually not optimal in the sense that the condition number of the preconditioned system will grow as the mesh size h is reduced, although usually at a slower rate than for the unpreconditioned system. On the other hand, they are often quite robust with respect to other, more algebraic features of the problem, such as rough and anisotropic coefficients and strong convection terms. We will describe the basic ILU and modified ILU (MILU) preconditioners. Then we will review ...
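A minimal dense sketch of the two variants, using the standard zero-fill definitions (this is textbook ILU(0)/MILU(0), not anything specific to the chapter): dropped fill is discarded in ILU but folded into the diagonal in MILU, which makes LU reproduce the row sums of A.

```python
import numpy as np

def ilu0(A, modified=False):
    """Incomplete LU with zero fill-in, in-place KIJ form: L (unit lower)
    and U share the array F.  With modified=True (MILU), every dropped
    fill-in is subtracted from the diagonal instead of being discarded,
    which preserves the row sums of A."""
    F = A.astype(float).copy()
    pat = A != 0
    n = A.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            if not pat[i, k]:
                continue
            F[i, k] /= F[k, k]                  # multiplier l_ik
            for j in range(k + 1, n):
                if not pat[k, j]:
                    continue
                update = F[i, k] * F[k, j]
                if pat[i, j]:
                    F[i, j] -= update           # allowed position: apply
                elif modified:
                    F[i, i] -= update           # MILU: lump onto diagonal
    return F

# Small diagonally dominant test matrix whose elimination creates fill.
A = np.array([[ 4., -1.,  0.,  0., -1.],
              [-1.,  4., -1.,  0.,  0.],
              [ 0., -1.,  4., -1.,  0.],
              [ 0.,  0., -1.,  4., -1.],
              [-1.,  0.,  0., -1.,  4.]])

F = ilu0(A, modified=True)                      # MILU(0)
L = np.tril(F, -1) + np.eye(5)
U = np.triu(F)
F0 = ilu0(A)                                    # plain ILU(0)
L0 = np.tril(F0, -1) + np.eye(5)
U0 = np.triu(F0)

print("row sums of A:       ", A.sum(axis=1))
print("row sums of LU (MILU):", (L @ U).sum(axis=1))
```

The row sums of LU match those of A exactly for MILU (Gustafsson's row-sum property), while plain ILU(0) instead reproduces A entry-wise on its nonzero pattern.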
A structural diagnosis of some IC orderings
 SIAM J. Sci. Comput.
"... ..."
(Show Context)