Results 1-10 of 32
On Augmented Lagrangian methods with general lower-level constraints, 2005
Cited by 55 (6 self)
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems in which the constraints are only of the lower-level type. Two methods of this class are introduced and analyzed. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the Constant Positive Linear Dependence constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The reliability of the approach is tested by means of an exhaustive comparison against Lancelot. All the problems of the CUTE collection are used in this comparison. Moreover, the resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the Spectral Projected Gradient method for solving the subproblems. Problems of this type with more than 3 × 10^6 variables and 14 × 10^6 constraints are solved in this way, using moderate computer time.
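The outer loop described in this abstract can be sketched in a few lines. The following is a minimal, illustrative Python sketch (not the authors' implementation): an equality constraint is penalized in the objective, while a box "lower-level" constraint stays in the subproblem and is enforced by projection. The toy problem, step size, and tolerances are assumptions chosen for the example.

```python
def augmented_lagrangian(grad_f, h, grad_h, project, x0,
                         rho=10.0, outer=15, inner=500, tol=1e-6):
    """Sketch of an augmented Lagrangian outer loop for
    min f(x) s.t. h(x) = 0, with a 'lower-level' box constraint
    kept inside the subproblem and enforced by projection."""
    x, lam = list(x0), 0.0
    for _ in range(outer):
        step = 1.0 / (2.0 + 2.0 * rho)  # safe step for this toy quadratic
        # Inner solve by projected gradient on f + lam*h + (rho/2)*h^2.
        for _ in range(inner):
            hx = h(x)
            g = [gf + (lam + rho * hx) * gh
                 for gf, gh in zip(grad_f(x), grad_h(x))]
            x = project([xi - step * gi for xi, gi in zip(x, g)])
        lam += rho * h(x)               # first-order multiplier update
        if abs(h(x)) > tol:             # still infeasible: tighten penalty
            rho *= 2.0
    return x, lam

# Toy problem: min (x1-1)^2 + (x2-2.5)^2  s.t.  x1 + x2 = 2,  0 <= x <= 3.
grad_f = lambda x: [2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.5)]
h = lambda x: x[0] + x[1] - 2.0
grad_h = lambda x: [1.0, 1.0]
project = lambda x: [min(max(xi, 0.0), 3.0) for xi in x]

x_star, lam_star = augmented_lagrangian(grad_f, h, grad_h, project, [0.0, 0.0])
# x_star is close to (0.25, 1.75), the projection of (1, 2.5) onto the line.
```

Note the division of labor: the "upper-level" equality constraint moves into the objective via the multiplier and penalty, while the lower-level box is handled by the cheap projection inside the subproblem, exactly the situation the abstract describes.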
Large-Scale Active-Set Box-Constrained Optimization Method with Spectral Projected Gradients. Computational Optimization and Applications, 2001
Cited by 55 (10 self)
A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line search that aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradients) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the reliability of the new algorithm. Keywords: box-constrained minimization, numerical methods, active-set strategies, spectral projected gradients.
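The "spectral" step that gives the method its name is the Barzilai-Borwein step length computed from the last two iterates. Below is a stripped-down Python sketch of that idea on a made-up box-constrained toy problem; the nonmonotone line search and safeguards of the actual SPG method are omitted.

```python
def spg(grad, project, x0, iters=50, alpha=1.0):
    """Sketch of the spectral projected gradient idea: a projected
    gradient step whose length is the Barzilai-Borwein 'spectral'
    step alpha = <s,s>/<s,y> from the last two iterates."""
    x = list(x0)
    g = grad(x)
    for _ in range(iters):
        x_new = project([xi - alpha * gi for xi, gi in zip(x, g)])
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]   # iterate difference
        y = [a - b for a, b in zip(g_new, g)]   # gradient difference
        sy = sum(si * yi for si, yi in zip(s, y))
        if sy > 1e-12:
            ss = sum(si * si for si in s)
            alpha = min(max(ss / sy, 1e-4), 1e4)  # safeguarded spectral step
        x, g = x_new, g_new
    return x

# Toy usage: minimize (x1-3)^2 + (x2+1)^2 over the box [0, 2]^2.
grad = lambda x: [2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)]
clip = lambda x: [min(max(xi, 0.0), 2.0) for xi in x]
x_box = spg(grad, clip, [0.0, 0.0])
# The constrained minimizer is the corner-and-edge point (2, 0).
```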
Augmented Lagrangian methods under the Constant Positive Linear Dependence constraint qualification
Numerical comparison of Augmented Lagrangian algorithms for nonconvex problems. Computational Optimization and Applications, 2004
Cited by 27 (1 self)
Augmented Lagrangian algorithms are very popular tools for solving nonlinear programming problems. At each outer iteration of these methods a simpler optimization problem is solved, for which efficient algorithms can be used, especially when the problems are large. The most famous Augmented Lagrangian algorithm for minimization with inequality constraints is the Powell-Hestenes-Rockafellar (PHR) method. The main drawback of PHR is that the objective function of the subproblems is not twice continuously differentiable. This is the main motivation for the introduction of many alternative Augmented Lagrangian methods.
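The PHR function mentioned in the abstract can be written down directly. The sketch below (with a made-up toy constraint) exhibits the max(0, ·)² term responsible for the loss of second derivatives: it is continuously differentiable once, but its second derivative jumps where the argument of the max crosses zero.

```python
def phr(f, gs, lam, rho, x):
    """PHR augmented Lagrangian for min f(x) s.t. g_i(x) <= 0:
    L(x) = f(x) + (1/(2*rho)) * sum_i (max(0, lam_i + rho*g_i(x))**2 - lam_i**2).
    Each max(0, .)**2 term is C^1 but not C^2 on the surface
    lam_i + rho*g_i(x) = 0, which is the drawback discussed above."""
    val = f(x)
    for lam_i, g in zip(lam, gs):
        val += (max(0.0, lam_i + rho * g(x)) ** 2 - lam_i ** 2) / (2.0 * rho)
    return val

# Toy instance: minimize x^2 subject to 1 - x <= 0 (i.e. x >= 1).
f = lambda x: x[0] ** 2
g = lambda x: 1.0 - x[0]
on_boundary = phr(f, [g], [2.0], 10.0, [1.0])  # constraint active: penalty cancels
interior = phr(f, [g], [2.0], 10.0, [2.0])     # constraint inactive: max() clips to 0
```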
A Box-Constrained Optimization Algorithm with Negative Curvature Directions and Spectral Projected Gradients, 2001
Cited by 26 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face, a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented. Keywords: box-constrained minimization, active-set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
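For intuition about "negative curvature directions": along an eigenvector of a negative Hessian eigenvalue, the quadratic model decreases without bound, so such a direction is always useful for descent inside a face. A hypothetical 2x2 helper (not the paper's algorithm) extracts one in closed form:

```python
def negative_curvature_direction(hess):
    """If the symmetric 2x2 Hessian [[a, b], [b, c]] has a negative
    eigenvalue, return a unit eigenvector for its smallest eigenvalue
    (a direction of negative curvature); otherwise return None."""
    a, b, c = hess[0][0], hess[0][1], hess[1][1]
    mean = 0.5 * (a + c)
    rad = ((0.5 * (a - c)) ** 2 + b * b) ** 0.5
    lam_min = mean - rad            # smallest eigenvalue
    if lam_min >= 0.0:
        return None                 # positive semidefinite: no negative curvature
    if abs(b) > 1e-12:
        v = [lam_min - c, b]        # satisfies b*v1 + (c - lam_min)*v2 = 0
    else:
        v = [1.0, 0.0] if a < c else [0.0, 1.0]
    n = (v[0] ** 2 + v[1] ** 2) ** 0.5
    return [v[0] / n, v[1] / n]

# Indefinite Hessian diag(1, -2): negative curvature along the second axis.
d = negative_curvature_direction([[1.0, 0.0], [0.0, -2.0]])
# Positive definite Hessian: no such direction exists.
none_case = negative_curvature_direction([[2.0, 1.0], [1.0, 2.0]])
```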
A new active set algorithm for box constrained optimization. SIAM Journal on Optimization
Structured minimal-memory inexact quasi-Newton method and secant preconditioners for Augmented Lagrangian Optimization, 2006
Cited by 13 (2 self)
Augmented Lagrangian methods for large-scale optimization usually require efficient algorithms for minimization with box constraints. On the other hand, active-set box-constrained methods employ unconstrained optimization algorithms for minimization inside the faces of the box. Several approaches may be employed for computing internal search directions in the large-scale case. In this paper a minimal-memory quasi-Newton approach with secant preconditioners is proposed, taking into account the structure of Augmented Lagrangians that come from the popular Powell-Hestenes-Rockafellar scheme. A combined algorithm, which uses the quasi-Newton formula or a truncated-Newton procedure depending on the presence of active constraints in the penalty-Lagrangian function, is also suggested. Numerical experiments using the CUTE collection are presented.
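One of the simplest minimal-memory devices of this kind is the memoryless BFGS direction: a single secant pair (s, y) updates the identity, and the resulting direction d = -H g is computed in O(n) memory without ever forming H. This is an illustrative stand-in, not the structured preconditioner of the paper:

```python
def memoryless_bfgs_direction(g, s, y):
    """Direction d = -H*g, where H is one BFGS update of the identity
    with the secant pair (s, y); computed with inner products only."""
    sy = sum(si * yi for si, yi in zip(s, y))
    if sy <= 1e-12:
        return [-gi for gi in g]        # curvature test fails: steepest descent
    yy = sum(yi * yi for yi in y)
    sg = sum(si * gi for si, gi in zip(s, g))
    yg = sum(yi * gi for yi, gi in zip(y, g))
    # H = I - (s y' + y s')/sy + (1 + yy/sy) * s s'/sy applied to g:
    return [-(gi - (si * yg + yi * sg) / sy
              + (1.0 + yy / sy) * si * sg / sy)
            for gi, si, yi in zip(g, s, y)]

# Sanity usage: H satisfies the secant equation H*y = s, so for g = y
# the direction is -s.
d = memoryless_bfgs_direction([2.0, 0.0], [1.0, 0.0], [2.0, 0.0])
```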
Low Order-Value Optimization and Applications, 2005
Cited by 7 (4 self)
Given r real functions F_1(x), ..., F_r(x) and an integer p between 1 and r, the Low Order-Value Optimization (LOVO) problem consists of minimizing the sum of the p smallest of these function values. If (y_1, ..., y_r) is a vector of data and T(x, t_i) is the predicted value of observation i under the parameters x ∈ R^n, it is natural to define F_i(x) = (T(x, t_i) − y_i)^2 (the quadratic error at observation i under the parameters x). When p = r this LOVO problem coincides with the classical nonlinear least-squares problem. However, the interesting situation is when p is smaller than r. In that case, the solution of LOVO allows one to discard the influence of an estimated number of outliers. Thus, the LOVO problem is an interesting tool for robust estimation of parameters of nonlinear models. When p ≪ r the LOVO problem may be used to find hidden structures in data sets. One of the most successful applications is the Protein Alignment problem. Fully documented algorithms for this application are available at www.ime.unicamp.br/~martinez/lovoalign. In this paper optimality conditions are discussed, algorithms for solving the LOVO problem are introduced, and convergence theorems are proved. Finally, numerical experiments are presented.
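The LOVO objective itself is a one-liner. The sketch below uses a made-up linear model T(x, t) = x*t and toy data to show how choosing p < r discards an outlier's residual, exactly as the abstract describes:

```python
def lovo(F, x, p):
    """Low Order-Value objective: sum of the p smallest values F_i(x)."""
    return sum(sorted(f(x) for f in F)[:p])

# Made-up data with one outlier; model T(x, t) = x * t, F_i = squared error.
# Default arguments pin ti, yi per closure (a standard Python idiom).
t, y = [1.0, 2.0, 3.0], [1.0, 2.0, 100.0]
F = [lambda x, ti=ti, yi=yi: (x * ti - yi) ** 2 for ti, yi in zip(t, y)]

full = lovo(F, 1.0, 3)     # p = r: classical least squares, outlier included
trimmed = lovo(F, 1.0, 2)  # p < r: the outlier residual is discarded
```

At the true slope x = 1, the two clean observations fit exactly, so the trimmed objective is zero while the full objective is dominated by the single outlier.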
BOXQUACAN and the implementation of Augmented Lagrangian algorithms for minimization with inequality constraints, 1998
Cited by 7 (2 self)
BOXQUACAN is a trust-region box-constrained optimization software package developed at the Applied Mathematics Department of the University of Campinas. During the last five years it has been used for solving many practical and academic problems with box constraints, and it has been incorporated as a subalgorithm of Augmented Lagrangian methods for minimization with equality constraints and bounds. This paper describes its use in connection with Augmented Lagrangian algorithms in which inequality constraints are handled without the addition of slack variables. Numerical experiments comparing a modified exponential Lagrangian method and the classical Augmented Lagrangian method are presented. (Institute of Mathematics, University of Campinas, CP 6065, 13081-970 Campinas SP, Brazil. This work was supported by PRONEX, FAPESP (grant 9037246), FINEP, CNPq, and FAEP-UNICAMP.)
Local Convergence of an Inexact-Restoration Method and Numerical Experiments, 2007
Cited by 4 (1 self)
Local convergence of an inexact-restoration method for nonlinear programming is proved. Numerical experiments are performed with the objective of evaluating the behavior of the purely local method against a globally convergent nonlinear programming algorithm.