Results 1–10 of 10
Trust-Region Interior-Point SQP Algorithms for a Class of Nonlinear Programming Problems
SIAM J. Control Optim., 1997
Cited by 35 (8 self)
Abstract:
In this paper a family of trust-region interior-point SQP algorithms for the solution of a class of minimization problems with nonlinear equality constraints and simple bounds on some of the variables is described and analyzed. Such nonlinear programs arise e.g. from the discretization of optimal control problems. The algorithms treat states and controls as independent variables. They are designed to take advantage of the structure of the problem. In particular they do not rely on matrix factorizations of the linearized constraints, but use solutions of the linearized state equation and the adjoint equation. They are well suited for large scale problems arising from optimal control problems governed by partial differential equations. The algorithms keep strict feasibility with respect to the bound constraints by using an affine scaling method proposed for a different class of problems by Coleman and Li and they exploit trust-region techniques for equality-constrained optimizatio...
Complete Orthogonal Decomposition for Weighted Least Squares
SIAM J. Matrix Anal. Appl., 1995
Cited by 14 (4 self)
Abstract:
Consider a full-rank weighted least-squares problem in which the weight matrix is highly ill-conditioned. Because of the ill-conditioning, standard methods for solving least-squares problems, QR factorization and the null-space method for example, break down. G. W. Stewart established a norm bound for such a system of equations, indicating that it may be possible to find an algorithm that gives an accurate solution. S. A. Vavasis proposed a new definition of stability that is based on this result. He also defined the NSH algorithm for solving this least-squares problem and showed that it satisfies his definition of stability. In this paper, we propose a complete orthogonal decomposition algorithm to solve this problem and show that it is also stable. This new algorithm is simpler and more efficient than the NSH method. 1 Introduction. We consider solving the problem $\min_{y \in \mathbb{R}^n} \| D^{-1/2} (Ay - b) \|$ (1) for $y$, where $D$ is a symmetric positive definite $m \times m$ matrix, $A$ is an ...
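To make problem (1) concrete, here is a minimal NumPy sketch of a weighted least-squares solve by simple row scaling followed by an SVD-based solver; this illustrates the problem class only, not the complete orthogonal decomposition or NSH algorithms discussed in the paper, and the sizes, data, and weight spread are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 3                        # illustrative sizes
A = rng.standard_normal((m, n))    # full column rank (with probability 1)
b = rng.standard_normal(m)

# D: symmetric positive definite weight matrix (diagonal here) whose
# entries span several orders of magnitude, mimicking ill-conditioning.
d_diag = 10.0 ** rng.uniform(-4.0, 0.0, size=m)

# Solve min_y || D^{-1/2} (A y - b) || by scaling each row by 1/sqrt(d_i)
# and handing the scaled problem to an SVD-based least-squares solver.
scale = 1.0 / np.sqrt(d_diag)
y, *_ = np.linalg.lstsq(scale[:, None] * A, scale * b, rcond=None)

# At the minimizer the residual A y - b is orthogonal to range(A) in the
# D^{-1} inner product, so this quantity should be near zero.
r = A @ y - b
print(np.linalg.norm(A.T @ (r / d_diag)))
```

For genuinely extreme weight spreads this naive scaling is exactly where standard methods degrade, which is the situation the paper's stable algorithm targets.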
On Interior-Point Newton Algorithms for Discretized Optimal Control Problems with State Constraints
Optim. Methods Softw., 1998
Cited by 7 (2 self)
Abstract:
In this paper we consider a class of nonlinear programming problems that arise from the discretization of optimal control problems with bounds on both the state and the control variables. For this class of problems, we analyze constraint qualifications and optimality conditions in detail. We derive an affine-scaling and two primal-dual interior-point Newton algorithms by applying, in an interior-point way, Newton's method to equivalent forms of the first-order optimality conditions. Under appropriate assumptions, the interior-point Newton algorithms are shown to be locally well-defined with a q-quadratic rate of local convergence. By using the structure of the problem, the linear algebra of these algorithms can be reduced to the null space of the Jacobian of the equality constraints. The similarities between the three algorithms are pointed out, and their corresponding versions for the general nonlinear programming problem are discussed.
Newton-KKT Interior-Point Methods for Indefinite Quadratic Programming
Comput. Optim. Appl.
Cited by 7 (1 self)
Abstract:
Two interior-point algorithms are proposed and analyzed, for the (local) solution of (possibly) indefinite quadratic programming problems. They are of the Newton-KKT variety in that (much like in the case of primal-dual algorithms for linear programming) search directions for the “primal” variables and the Karush-Kuhn-Tucker (KKT) multiplier estimates are components of the Newton (or quasi-Newton) ...
Copositive optimization – recent developments and applications
European Journal of Operational Research, 2012
Cited by 4 (1 self)
Abstract:
Due to its versatility, copositive optimization receives increasing interest in the Operational Research community, and is a rapidly expanding and fertile field of research. It is a special case of conic optimization, which consists of minimizing a linear function over a cone subject to linear constraints. The diversity of copositive formulations in different domains of optimization is impressive, since problem classes both in the continuous and discrete world, as well as both deterministic and stochastic models are covered. Copositivity appears in local and global optimality conditions for quadratic optimization, but can also yield tighter bounds for NP-hard combinatorial optimization problems. Here some of the recent success stories are told, along with principles, algorithms and applications.
An Interior-Point Method for General Large-Scale Quadratic Programming Problems
Annals of Operations Research, 1996
Cited by 1 (0 self)
Abstract:
In this paper we present an interior point algorithm for solving both convex and nonconvex quadratic programs. The method, which is an extension of our interior point work on linear programming problems, efficiently solves a wide class of large scale problems and forms the basis for a sequential quadratic programming (SQP) solver for general large scale nonlinear programs. The key to the algorithm is a 3-dimensional cost-improvement subproblem, which is solved at every iteration. We have developed an approximate recentering procedure and a novel, adaptive big-M Phase I procedure that are essential to the success. We describe the basic method along with the recentering and big-M Phase I procedures. Details of the implementation and computational results are also presented. Keywords: big-M Phase I procedure, convex quadratic programming, interior point methods, linear programming, method of centers, multidirectional search direction, nonconvex quadratic programming, recentering.
Stable Computation of Search Directions for Near-Degenerate Linear Programming Problems
1997
An Infeasible Active Set Method for Convex Problems With Simple Bounds
2000
Abstract:
A primal-dual active set method for convex quadratic problems with bound constraints is presented. Based on a guess on the active set, a primal-dual pair $(x, s)$ is computed that satisfies the first order optimality condition and the complementarity condition. If $(x, s)$ is not feasible, a new active set is determined, and the process is iterated. Sufficient conditions for the iterations to stop in a finite number of steps with an optimal solution are provided. Computational experience indicates that this approach often requires only a few (less than 10) iterations to find the optimal solution. 1 Introduction. We consider the convex programming problem (P) $\min J(x)$ subject to $x - b \ge 0$, (1) where $J(x) := \frac{1}{2} x^T Q x + d^T x$, $Q$ is a positive definite $n \times n$ matrix, and $b, d \in \mathbb{R}^n$. This problem has received considerable interest in the literature. We recall some of the more recent contributions. Supported in part by the Fonds zur Förderung der wissenschaftlichen Forschung (FWF), Austria, ...
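The primal-dual active set iteration described above (guess the active set, solve the free subproblem, read off the multipliers, re-guess) can be sketched in a few lines of NumPy. This is a hedged illustration under the stated problem form with constraint $x \ge b$, not the authors' implementation; the function name `active_set_qp`, the parameter `c`, and the demo data are invented:

```python
import numpy as np

def active_set_qp(Q, d, b, max_iter=50, c=1.0):
    """Primal-dual active set sketch for min 1/2 x'Qx + d'x  s.t.  x >= b.

    Q must be symmetric positive definite. Returns a primal-dual pair
    (x, s), where s = Qx + d is the multiplier estimate for the bounds.
    """
    n = len(d)
    active = np.zeros(n, dtype=bool)       # initial guess: no bound active
    x, s = b.copy(), np.zeros(n)
    for _ in range(max_iter):
        x = b.copy()                       # active components sit on the bound
        s = np.zeros(n)                    # inactive components: zero multiplier
        inact = ~active
        # Solve the free subproblem on the inactive components.
        x[inact] = np.linalg.solve(
            Q[np.ix_(inact, inact)],
            -d[inact] - Q[np.ix_(inact, active)] @ b[active],
        )
        # Stationarity Qx + d - s = 0 gives the multipliers on the active set.
        s[active] = (Q @ x + d)[active]
        # Predict the next active set from the complementarity residual.
        new_active = s + c * (b - x) > 0
        if np.array_equal(new_active, active):
            break                          # KKT point reached
        active = new_active
    return x, s

# Tiny demo: minimize 1/2 ||x||^2 + d'x subject to x >= 0.
Q = np.eye(2)
d = np.array([-2.0, 1.0])
b = np.zeros(2)
x, s = active_set_qp(Q, d, b)
print(x, s)  # the second coordinate is pushed onto its bound
```

On this toy problem the iteration settles on the correct active set after one update, matching the abstract's observation that few iterations are typically needed.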
Recovery Of Blocky Images
SIAM J. Appl. Math., 1996
Abstract:
The purpose of this investigation is to understand situations under which an enhancement method succeeds in recovering an image from data which are noisy and blurred. The method in question is due to Rudin and Osher. The method selects, from a class of feasible images, one that has the least total variation.