Results 1 - 5 of 5
On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
Mathematical Programming, 2006
Abstract

Cited by 144 (5 self)
We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.
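The filter mechanism this abstract refers to can be illustrated with a minimal sketch (not IPOPT's actual implementation; the margin constants and function names are assumptions): each filter entry is a pair of constraint violation theta and objective phi, and a trial point is accepted only if it sufficiently improves at least one of the two measures relative to every stored entry.

```python
# Minimal sketch of the filter acceptance test used in filter line-search
# methods. GAMMA_THETA and GAMMA_PHI are small margins requiring *sufficient*
# improvement; their values here are illustrative assumptions.

GAMMA_THETA = 1e-5  # assumed margin for constraint-violation improvement
GAMMA_PHI = 1e-5    # assumed margin for objective improvement

def acceptable(theta_trial, phi_trial, filter_entries):
    """Return True if (theta_trial, phi_trial) is not dominated by any filter entry."""
    for theta_f, phi_f in filter_entries:
        improves_theta = theta_trial <= (1.0 - GAMMA_THETA) * theta_f
        improves_phi = phi_trial <= phi_f - GAMMA_PHI * theta_f
        if not (improves_theta or improves_phi):
            return False  # dominated by this entry: reject the trial point
    return True

def augment_filter(theta, phi, filter_entries):
    """Store a new (theta, phi) pair, dropping entries it dominates."""
    filter_entries[:] = [(t, p) for t, p in filter_entries
                         if t < theta or p < phi]
    filter_entries.append((theta, phi))
```

In a full method, accepted iterates that do not satisfy a sufficient objective decrease are added to the filter via `augment_filter`, which is what rules out cycling between infeasibility and objective improvement.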
Interior-Point l_2-Penalty Methods for Nonlinear Programming with Strong Global Convergence Properties
Math. Programming, 2004
Abstract

Cited by 5 (0 self)
We propose two line-search primal-dual interior-point methods that have a generic barrier-SQP outer structure and approximately solve a sequence of equality-constrained barrier subproblems. To enforce convergence for each subproblem, these methods use an l_2 exact penalty function, eliminating the need to drive the corresponding penalty parameter to infinity when finite multipliers exist. Instead of directly decreasing an equality-constraint infeasibility measure, these methods attain feasibility by forcing this measure to zero whenever the steps generated by the methods tend to zero. Our analysis shows that under standard assumptions, our methods have strong global convergence properties. Specifically, we show that if the penalty parameter remains bounded, any limit point of the iterate sequence is either a KKT point of the barrier subproblem, or a Fritz-John (FJ) point of the original problem that fails to satisfy the Mangasarian-Fromovitz constraint qualification (MFCQ); if the penalty parameter tends to infinity, there is a limit point that is either an infeasible FJ point of the inequality-constrained feasibility problem (an infeasible stationary point of the infeasibility measure if slack variables are added) or a FJ point of the original problem at which the MFCQ fails to hold. Numerical results are given that illustrate these outcomes.
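The l_2 exact penalty function the abstract describes has a simple generic form; the sketch below (with placeholder `f`, `c`, and `nu`, not the paper's test problems) shows the key point that the 2-norm of the constraint residual enters *non-squared*, which is what makes the penalty exact for a finite penalty parameter.

```python
import math

# Hedged sketch of an l_2 exact penalty function: the (barrier) objective f
# plus nu times the non-squared 2-norm of the equality-constraint residual
# c(x). All names here are illustrative assumptions.

def l2_penalty(f, c, x, nu):
    """phi(x; nu) = f(x) + nu * ||c(x)||_2."""
    residual = c(x)
    return f(x) + nu * math.sqrt(sum(r * r for r in residual))
```

Because the norm is not squared, a sufficiently large but finite `nu` (roughly, above the norm of the constraint multipliers, when they exist) already makes minimizers of the penalty feasible, consistent with the abstract's claim that the parameter need not be driven to infinity.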
Trust Region SQP Methods with Inexact Linear System Solves for Large-Scale Optimization
, 2006
Digital Filter Stepsize Control of DASPK and its Effect on Control Optimization Performance
M.Sc. Thesis, UCSB, 2004
Abstract

Cited by 1 (0 self)
It has long been known that the solutions produced by adaptive solvers for ordinary differential equation (ODE) and differential algebraic equation (DAE) systems, while generally reliable, are not smooth with respect to perturbations in initial conditions or other problem parameters. Söderlind and Wang [12, 13] have recently developed a digital filter stepsize controller that has a theoretical basis in control theory and appears to result in a much smoother dependence of the solution on problem parameters. This property seems particularly important in the control and optimization of dynamical systems, where the optimizer generally expects the DAE solver to return solutions that vary smoothly with respect to the parameters. We have implemented the digital filter stepsize controller in the DAE solver DASPK 3.1, and used the new solver for the optimization of dynamical systems. The improved performance of the optimizer, as a result of the new stepsize controller, is demonstrated on a biological problem regarding the heat shock response of Escherichia coli.
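A digital-filter stepsize controller of the kind attributed to Söderlind can be sketched as follows; the specific exponents below follow the commonly cited H211b filter with b = 4 and are an assumption, not taken from the thesis. The next step size depends smoothly on the two most recent error estimates and step sizes, which is what yields the smoother parameter dependence the abstract describes.

```python
# Hedged sketch of a digital-filter stepsize controller in the spirit of
# Soderlind's H211b filter. Unlike a classical I-controller, which reacts
# only to the latest error, this filter blends error and step-ratio history.

def h211b_step(h_n, h_prev, err_n, err_prev, tol, b=4.0):
    """Return the next step size from the two most recent steps and errors."""
    beta = 1.0 / (4.0 * b)   # error-feedback gains (assumed H211b values)
    alpha = 1.0 / b          # step-ratio feedback gain
    factor = ((tol / err_n) ** beta
              * (tol / err_prev) ** beta
              * (h_n / h_prev) ** (-alpha))
    return h_n * factor
```

In a production solver the step-change `factor` would additionally be clamped to a safe range and the step rejected when the error exceeds the tolerance; those safeguards are omitted here to keep the filter itself visible.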
CHALLENGES AND RESEARCH ISSUES FOR PRODUCT AND PROCESS DESIGN OPTIMIZATION
Abstract

Cited by 1 (0 self)
Optimization as an enabling technology has been one of the big success stories in process systems engineering. In this paper we first present a general review of optimization and its applications to a variety of problems in process systems engineering. Next, we provide an overview of two key areas: nonlinear programming and logic-based discrete/continuous optimization. In particular, recent advances are presented in the modeling and solution of nonlinear programs, dynamic optimization, mixed-integer and generalized disjunctive programming, global optimization, and constraint programming. The impact of these techniques is illustrated with several example problems.