Results 1–10 of 35
Numerical solution of saddle point problems
 ACTA NUMERICA
, 2005
Cited by 180 (30 self)
Large linear systems of saddle point type arise in a wide variety of applications throughout computational science and engineering. Due to their indefiniteness and often poor spectral properties, such linear systems represent a significant challenge for solver developers. In recent years there has been a surge of interest in saddle point problems, and numerous solution techniques have been proposed for systems of this type. The aim of this paper is to present and discuss a large selection of solution methods for linear systems in saddle point form, with an emphasis on iterative methods for large and sparse problems.
Approximate factorization constraint preconditioners for saddle-point matrices
 SIAM J. Sci. Comput.
Cited by 13 (2 self)
Abstract. We consider the application of the conjugate gradient method to the solution of large, symmetric indefinite linear systems. Special emphasis is put on the use of constraint preconditioners and a new factorization that can reduce the number of flops required by the preconditioning step. Results concerning the eigenvalues of the preconditioned matrix and its minimum polynomial are given. Numerical experiments validate these conclusions.
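The eigenvalue results this abstract refers to follow the pattern established by Keller, Gould and Wathen: for a constraint preconditioner P = [G Bᵀ; B 0] applied to K = [A Bᵀ; B 0], the preconditioned matrix P⁻¹K has the eigenvalue 1 with multiplicity 2m. A numerical sketch (random data; G = I chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 10
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)                 # SPD (1,1) block
B = rng.standard_normal((m, n))             # full-row-rank constraints
Z = np.zeros((m, m))
K = np.block([[A, B.T], [B, Z]])            # saddle point matrix
G = np.eye(n)                               # cheap SPD approximation to A
P = np.block([[G, B.T], [B, Z]])            # constraint preconditioner
eigs = np.linalg.eigvals(np.linalg.solve(P, K))
n_unit = int(np.sum(np.abs(eigs - 1.0) < 1e-3))
print(n_unit)   # -> 20, i.e. 2m unit eigenvalues
```

The remaining n − m eigenvalues are governed by a reduced (null-space) generalized eigenproblem, which is where the choice of G matters.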
Inexact Constraint Preconditioners for Linear Systems Arising in Interior Point Methods
, 2005
Cited by 12 (7 self)
Abstract. Issues of indefinite preconditioning of reduced Newton systems arising in optimization with interior point methods are addressed in this paper. Constraint preconditioners have shown much promise in this context. However, there are situations in which an unfavorable sparsity pattern of the Jacobian matrix may adversely affect the preconditioner and make its inverse representation unacceptably dense, and hence too expensive to use in practice. A remedy for such situations is proposed in this paper. An approximate constraint preconditioner is considered in which a sparse approximation of the Jacobian is used instead of the complete matrix. Spectral analysis of the preconditioned matrix is performed and bounds on its nonunit eigenvalues are provided. Preliminary computational results are encouraging. Keywords: Interior-point methods, iterative solvers, preconditioners, approximate Jacobian.
Iterative solution of augmented systems arising in interior methods
 SIAM Journal on Optimization
Cited by 9 (1 self)
Abstract. Iterative methods are proposed for certain augmented systems of linear equations that arise in interior methods for general nonlinear optimization. Interior methods define a sequence of KKT equations that represent the symmetrized (but indefinite) equations associated with Newton’s method for a point satisfying the perturbed optimality conditions. These equations involve both the primal and dual variables and become increasingly ill-conditioned as the optimization proceeds. In this context, an iterative linear solver must not only handle the ill-conditioning but also detect the occurrence of KKT matrices with the wrong matrix inertia. A one-parameter family of equivalent linear equations is formulated that includes the KKT system as a special case. The discussion focuses on a particular system from this family, known as the “doubly augmented system,” that is positive definite with respect to both the primal and dual variables. This property means that a standard preconditioned conjugate-gradient method involving both primal and dual variables will either terminate successfully or detect if the KKT matrix has the wrong inertia. Constraint preconditioning is a well-known technique for preconditioning the conjugate-gradient method on augmented systems. A family of constraint preconditioners is proposed that provably eliminates the inherent ill-conditioning in the augmented system. A considerable benefit of combining constraint preconditioning with the doubly augmented system is that the preconditioner need not be applied exactly. Two particular “active-set” constraint preconditioners are formulated that involve only a subset of the rows of the augmented system and thereby may be applied with considerably less work. Finally, some numerical experiments illustrate the numerical performance of the proposed preconditioners and highlight some theoretical properties of the preconditioned matrices.
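One way to see a doubly augmented construction concretely: starting from a KKT matrix K = [H Aᵀ; A −D] with D positive definite, adding 2AᵀD⁻¹ times the second block row to the first and flipping the sign of the dual variables yields a symmetric matrix that is positive definite exactly when K has the correct inertia (guaranteed below by taking H positive definite). A sketch with random data; the sign conventions are one consistent choice for illustration, not necessarily the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 20, 8
R = rng.standard_normal((n, n))
H = R @ R.T + np.eye(n)             # SPD Hessian block => correct inertia
A = rng.standard_normal((m, n))     # Jacobian block
D = 1e-2 * np.eye(m)                # small positive (2,2) regularization
K = np.block([[H, A.T], [A, -D]])   # symmetric indefinite KKT matrix

S = A.T @ np.linalg.solve(D, A)
Kd = np.block([[H + 2 * S, A.T], [A, D]])   # doubly augmented matrix
print(np.linalg.eigvalsh(K).min() < 0, np.linalg.eigvalsh(Kd).min() > 0)

# Same solution, up to the sign of the dual block.
b, d = rng.standard_normal(n), rng.standard_normal(m)
xy = np.linalg.solve(K, np.concatenate([b, d]))
rhs = np.concatenate([b + 2 * A.T @ np.linalg.solve(D, d), d])
xy_d = np.linalg.solve(Kd, rhs)
print(np.allclose(xy_d[:n], xy[:n]), np.allclose(xy_d[n:], -xy[n:]))
```

Positive definiteness of Kd is what allows a standard preconditioned conjugate-gradient method to be applied to both primal and dual variables at once.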
Novel preconditioners for the iterative solution to FE-discretized coupled consolidation equations
, 2007
Iterative Linear Algebra for Constrained Optimization
, 2005
Cited by 5 (2 self)
Each step of an interior point method for nonlinear optimization requires the solution of a symmetric indefinite linear system known as a KKT system, or more generally, a saddle point problem. As the problem size increases, direct methods become prohibitively expensive to use for solving these problems; this leads to iterative solvers being the only viable alternative. In this thesis we consider iterative methods for solving saddle point systems and show that a projected preconditioned conjugate gradient method can be applied to these indefinite systems. Such a method requires the use of a specific class of preconditioners, (extended) constraint preconditioners, which exactly replicate some parts of the saddle point system that we wish to solve. The standard method for using constraint preconditioners, at least in the optimization community, has been to choose the constraint
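A practical consequence of constraint preconditioning mentioned in this abstract is the minimum-polynomial bound of Keller, Gould and Wathen: the preconditioned Krylov method terminates in at most n − m + 2 iterations. A sketch using preconditioned GMRES on random data (GMRES stands in here for the projected preconditioned CG method discussed in the thesis, and the preconditioner is applied via an explicit inverse purely for illustration):

```python
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

rng = np.random.default_rng(3)
n, m = 30, 25                       # heavily constrained: n - m = 5
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)         # SPD (1,1) block
B = rng.standard_normal((m, n))
Z = np.zeros((m, m))
K = np.block([[A, B.T], [B, Z]])
P = np.block([[np.eye(n), B.T], [B, Z]])     # constraint preconditioner
Pinv = np.linalg.inv(P)             # explicit inverse, illustration only
M = LinearOperator(K.shape, matvec=lambda v: Pinv @ v)

b = rng.standard_normal(n + m)
res_hist = []                       # one entry per preconditioned iteration
x, info = gmres(K, b, M=M, callback=res_hist.append, callback_type='pr_norm')
# Theory bounds the iteration count by n - m + 2 = 7 here.
print(info, len(res_hist))
```

In practice the preconditioner is applied through a (possibly approximate) factorization of P rather than an explicit inverse; that is exactly where the approximate and extended constraint preconditioners of this literature come in.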
On eigenvalue distribution of constraint-preconditioned symmetric saddle point matrices
, 2011
A preconditioning technique for Schur complement systems arising in stochastic optimization
Low-rank update of preconditioners for the inexact Newton method . . .
 MATHEMATICAL AND COMPUTER MODELLING 54 (2011) 1863–1873
, 2011
Using constraint preconditioners with regularized saddle-point problems
 Comput. Optim. Appl
Cited by 3 (1 self)
The problem of finding good preconditioners for the numerical solution of a certain important class of indefinite linear systems is considered. These systems are of a 2 by 2 block (KKT) structure in which the (2,2) block (denoted by −C) is assumed to be nonzero. In Constraint preconditioning for indefinite linear systems, SIAM J. Matrix Anal. Appl., 21 (2000), Keller, Gould and Wathen introduced the idea of using constraint preconditioners that have a specific 2 by 2 block structure for the case of C being zero. We shall give results concerning the spectrum and form of the eigenvectors when a preconditioner of the form considered by Keller, Gould and Wathen is used but the system we wish to solve may have C ≠ 0. In particular, the results presented here indicate clustering of eigenvalues and, hence, faster convergence of Krylov subspace iterative methods when the entries of C are small; such situations arise naturally in interior point methods for optimization, and we present results for such problems which validate our conclusions.
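The clustering claim can be checked numerically: take K_C = [A Bᵀ; B −C] with C = εI while keeping the C = 0 constraint preconditioner. As ε → 0 the 2m unit eigenvalues of the ε = 0 theory are recovered, and for large ε the cluster disperses. An illustrative sketch (random data; the 0.01 clustering tolerance is chosen for demonstration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 30, 10
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)              # SPD (1,1) block
B = rng.standard_normal((m, n))          # full-row-rank (2,1) block
P = np.block([[np.eye(n), B.T], [B, np.zeros((m, m))]])  # C = 0 preconditioner

def near_one(eps):
    """Count eigenvalues of P^-1 K_C within 0.01 of 1, where C = eps*I."""
    K = np.block([[A, B.T], [B, -eps * np.eye(m)]])
    eigs = np.linalg.eigvals(np.linalg.solve(P, K))
    return int(np.sum(np.abs(eigs - 1.0) < 1e-2))

# 2m = 20 unit eigenvalues at eps = 0; the cluster survives small eps
# and disperses for larger eps.
print(near_one(0.0), near_one(1e-8), near_one(1.0))
```

This mirrors the regime relevant to interior point methods, where the entries of C shrink as the barrier parameter is driven to zero.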