Results 1 – 6 of 6
Sequential Quadratic Programming
, 1995
Abstract

Cited by 117 (3 self)
In this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ...
A Modified BFGS Method and Its Global Convergence in Nonconvex Minimization
, 1998
Abstract

Cited by 8 (2 self)
In this paper, we propose a modification of the BFGS method for unconstrained optimization. A remarkable feature of the proposed method is that it possesses a global convergence property even without a convexity assumption on the objective function. Under certain conditions, we also establish superlinear convergence of the method. Key words: BFGS method, global convergence, superlinear convergence. Present address (available before October, 1999): Department of Applied Mathematics and Physics, Graduate School of Engineering, Kyoto University, Kyoto 606, Japan, email: lidh@kuamp.kyoto-u.ac.jp. 1 Introduction. Let f : R^n -> R be continuously differentiable. Consider the following unconstrained optimization problem: min f(x), x in R^n. (1.1) Among numerous iterative methods for solving (1.1), quasi-Newton methods constitute a particularly important class. Throughout the paper, we assume that f in (1.1) has Lipschitz continuous gradients, i.e. there is a constant L > 0 such that ||g(x) - g(y)|| ...
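For orientation, the standard BFGS iteration that this paper modifies can be sketched as follows. This is a generic textbook version with an Armijo backtracking line search, not the modified update proposed in the paper; the test function and all constants are illustrative choices.

```python
# A minimal textbook BFGS iteration for min f(x), x in R^n -- a generic
# sketch for orientation, NOT the modified update of the paper above.
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    B = np.eye(n)                      # Hessian approximation B_k
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(B, -g)     # quasi-Newton direction B_k d = -g_k
        # Armijo backtracking line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        s = t * d                      # step s_k = x_{k+1} - x_k
        x_new = x + s
        y = grad(x_new) - g            # gradient difference y_k
        # Standard BFGS update; skipped when the curvature s'y is not positive
        sy = s.dot(y)
        if sy > 1e-12:
            Bs = B.dot(s)
            B = B - np.outer(Bs, Bs) / s.dot(Bs) + np.outer(y, y) / sy
        x = x_new
    return x

# Usage: a convex quadratic with minimizer (1, -2)
f = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] + 2.0)**2
g = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])
x_star = bfgs(f, g, [0.0, 0.0])
```

The skip-when-curvature-fails guard is exactly the point the abstract addresses: without convexity, s'y > 0 is not guaranteed, which is why a modified update is needed for global convergence.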
A Globally and Superlinearly Convergent Gauss-Newton Based BFGS Method for Symmetric Equations
, 1998
Abstract

Cited by 5 (4 self)
In this paper, we present a Gauss-Newton based BFGS method for solving symmetric nonlinear equations, which contain, as special cases, an unconstrained optimization problem, a saddle point problem and an equality constrained optimization problem. A suitable line search is proposed with which the presented BFGS method exhibits an approximate norm descent property. Under appropriate conditions, global convergence and superlinear convergence of the method are established. Key words: BFGS method, global convergence, superlinear convergence, symmetric equations. Present address (available until October, 1999): Department of Applied Mathematics and Physics, Graduate School of Engineering, Kyoto University, Kyoto 606, Japan, email: lidh@kuamp.kyoto-u.ac.jp. 1 Introduction. During the past two decades, much effort has been made to establish global convergence of quasi-Newton methods, especially for convex unconstrained minimization problems (e.g. [2], [3], [11], [14], [15], [16], [17]). We ...
A Derivative-Free Line Search and DFP Method for Symmetric Equations with Global and Superlinear Convergence
 Numer. Funct. Anal. Optim
, 1998
Abstract

Cited by 1 (1 self)
In this paper, we propose a derivative-free line search suited to iterative methods for solving systems of nonlinear equations with symmetric Jacobian matrices. The proposed line search can be implemented conveniently by a backtracking process and has the attractive property that any iterative method with this line search generates a sequence of iterates that is approximately norm descent. Moreover, if the Jacobian matrices are uniformly nonsingular, then the generated sequence converges to the unique solution. We incorporate this line search with a Gauss-Newton based DFP method for solving symmetric equations. Under appropriate conditions, we establish global and superlinear convergence of the proposed DFP method. The obtained results show, in particular, that the proposed DFP method with inexact line search converges globally and superlinearly even for nonconvex unconstrained optimization problems and equality constrained optimization problems.
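A derivative-free backtracking search of the kind described in this abstract can be sketched as follows. The acceptance condition below is an approximate norm-descent criterion in the spirit of this line of work; the specific form and all constants are illustrative assumptions, not the exact parameters of the paper.

```python
# Sketch of a derivative-free backtracking line search for F(x) = 0:
# accept a step length t once the residual norm satisfies the
# approximate norm-descent condition
#   ||F(x + t*d)|| <= (1 + eps_k) * ||F(x)|| - sigma * t^2 * ||d||^2,
# where the forcing terms eps_k are summable.  No derivatives of F are
# needed -- only residual evaluations.  Constants here are illustrative.
import numpy as np

def norm_descent_search(F, x, d, eps_k, sigma=1e-4, beta=0.5, max_back=40):
    """Backtrack t = 1, beta, beta^2, ... until the condition holds."""
    Fx_norm = np.linalg.norm(F(x))
    d_norm2 = d.dot(d)
    t = 1.0
    for _ in range(max_back):
        lhs = np.linalg.norm(F(x + t * d))
        if lhs <= (1.0 + eps_k) * Fx_norm - sigma * t * t * d_norm2:
            return t
        t *= beta
    return t  # fall back to the smallest trial step

# Usage on a 1-D equation F(x) = x^3 - 1 with a Newton-like direction
F = lambda x: np.array([x[0]**3 - 1.0])
x = np.array([2.0])
d = np.array([-(x[0]**3 - 1.0) / (3.0 * x[0]**2)])  # Newton direction
t = norm_descent_search(F, x, d, eps_k=0.1)
```

The (1 + eps_k) slack is what makes the descent only approximate: the residual norm may occasionally increase, but by a summable amount, which is enough for the global convergence arguments sketched in the abstract.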
Application of Polytopic Separation Techniques to Nonlinear Observer Design
, 2005
Abstract
Output feedback control of nonlinear systems is an important open research topic which attracts attention for both its theoretical interest and its practical applications. This thesis relies on the theory developed by Maggiore and Passino in [27] towards a separation principle for nonlinear systems. That work crucially relies on the existence and construction of a set P, enjoying special properties, separating two nonconvex sets. The class of sets P we choose to work with is that of polytopes. In this thesis we develop two algorithms to accomplish the task of separating two nonconvex sets. The first algorithm relies on semi-infinite programming. The second algorithm relies on orthogonal projection and outer polytopic approximation. Both algorithms are first tested on basic examples and later used to design nonlinear observers for the Moore-Greitzer three-state model of surge and stall in jet engine compressors.
Retaining Convergence Properties of Trust Region Methods Without Extra Gradient Evaluations
Abstract
Several recent computational studies have shown that trust-region quasi-Newton methods using the SR1, PSB, and BFGS updates are effective methods for solving unconstrained optimization problems. In addition, the analyses in Powell [1975] and Byrd, Khalfan, and Schnabel [1993] demonstrate strong convergence properties for some trust-region quasi-Newton methods. A computational disadvantage of the methods analyzed in these papers, for which the strongest convergence properties among trust-region quasi-Newton methods have been shown, is that the update at rejected trial points requires a gradient evaluation that would not otherwise be made. In this paper, we consider a modification of these methods that makes a different, less expensive update at rejected trial points. In particular, we propose a modification of the PSB method that uses only the function value at rejected points to make the update at those points. We then show how to modify Powell's analysis of the PSB method to prove the s...
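The accept/reject mechanism this abstract refers to can be sketched as follows: a trial step is accepted or rejected by comparing actual to model-predicted reduction, and rejected trial points are exactly where the analyzed methods would need the extra gradient evaluation. This is a generic textbook trust-region step using a Cauchy-point trial step; the thresholds (0.25, 0.75, eta = 0.1) are conventional choices, not those of the paper.

```python
# Generic sketch of one trust-region accept/reject decision.  The ratio of
# actual to predicted reduction, rho, decides whether the trial point is
# accepted and how the radius delta changes.  Constants are conventional
# textbook choices, not the parameters of the paper above.
import numpy as np

def trust_region_step(f, x, g, B, delta, eta=0.1):
    """One step with a Cauchy-point trial step (steepest descent in the region)."""
    # Cauchy point: minimize the quadratic model along -g inside the radius
    gBg = g.dot(B.dot(g))
    g_norm = np.linalg.norm(g)
    tau = 1.0 if gBg <= 0 else min(1.0, g_norm**3 / (delta * gBg))
    s = -tau * (delta / g_norm) * g
    pred = -(g.dot(s) + 0.5 * s.dot(B.dot(s)))   # predicted reduction m(0) - m(s)
    rho = (f(x) - f(x + s)) / pred               # actual / predicted reduction
    if rho < 0.25:
        delta *= 0.25            # poor model agreement: shrink the region
    elif rho > 0.75 and np.isclose(np.linalg.norm(s), delta):
        delta *= 2.0             # good agreement on the boundary: expand
    accepted = rho > eta         # rejected trial points leave x unchanged
    return (x + s if accepted else x), delta, accepted

# Usage on the quadratic f(x) = x'x with exact model Hessian B = 2I
f = lambda x: x.dot(x)
x = np.array([1.0, 1.0])
g = 2.0 * x
B = 2.0 * np.eye(2)
x_new, delta, accepted = trust_region_step(f, x, g, B, delta=1.0)
```

When `accepted` is False, a quasi-Newton method may still update B at the rejected trial point; the paper's contribution is making that update from the function value alone, without evaluating the gradient there.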