Results 1 - 10 of 309
Sequential Quadratic Programming
1995
Abstract

Cited by 166 (4 self)
In this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ...
SBA: a software package for generic sparse bundle adjustment
ACM Transactions on Mathematical Software, 2009
"... Foundation for Research and Technology—Hellas ..."
Newton's Method For Large Bound-Constrained Optimization Problems
SIAM JOURNAL ON OPTIMIZATION, 1998
"... We analyze a trust region version of Newton's method for boundconstrained problems. Our approach relies on the geometry of the feasible set, not on the particular representation in terms of constraints. The convergence theory holds for linearlyconstrained problems, and yields global and super ..."
Abstract

Cited by 107 (5 self)
We analyze a trust region version of Newton's method for bound-constrained problems. Our approach relies on the geometry of the feasible set, not on the particular representation in terms of constraints. The convergence theory holds for linearly constrained problems and yields global and superlinear convergence without assuming either strict complementarity or linear independence of the active constraints. We also show that the convergence theory leads to an efficient implementation for large bound-constrained problems.
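As a toy illustration of the bound-constrained setting described in this abstract (not the paper's trust-region Newton method), projected gradient descent on a convex quadratic shows how the geometry of the feasible box, rather than an explicit constraint representation, drives the iteration; the problem data below are invented for the example:

```python
import numpy as np

# Minimal sketch, not the paper's algorithm: projected gradient descent for
#   min 0.5 x^T A x - b^T x   subject to   l <= x <= u.

def projected_gradient(A, b, l, u, x0, step=0.1, iters=500):
    """Gradient step followed by projection onto the box [l, u]."""
    x = x0.astype(float)
    for _ in range(iters):
        grad = A @ x - b                     # gradient of the quadratic
        x = np.clip(x - step * grad, l, u)   # projection onto the feasible box
    return x

A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([2.0, 8.0])                     # unconstrained minimizer: (1, 2)
l = np.array([0.0, 0.0])
u = np.array([0.5, 3.0])
x = projected_gradient(A, b, l, u, np.zeros(2))
print(x)  # the first coordinate is held at its upper bound 0.5
```

The active bound is identified automatically by the projection; no list of active constraints is ever formed, which loosely mirrors the abstract's point about relying on feasible-set geometry.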
A reflective Newton method for minimizing a quadratic function subject to bounds on some of the variables
1992
Abstract

Cited by 97 (3 self)
We propose a new algorithm, a reflective Newton method, for the minimization of a quadratic function of many variables subject to upper and lower bounds on some of the variables. The method applies to a general (indefinite) quadratic function for which a local minimizer subject to bounds is required, and is particularly suitable for the large-scale problem. Our new method exhibits strong convergence properties, global and quadratic convergence, and appears to have significant practical potential. Strictly feasible points are generated. Experimental results on moderately large and sparse problems support the claim of practicality for large-scale problems.
1 Research partially supported by the Applied Mathematical Sciences Research Program (KC-04-02) of the Office of Energy Research of the U.S. Department of Energy under grant DE-FG02-86ER25013.A000, by the Computational Mathematics Program of the National Science Foundation under grant DMS-8706133, and by the Cornell Theory Cen...
On Augmented Lagrangian methods with general lower-level constraints
2005
Abstract

Cited by 84 (7 self)
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems in which the constraints are only of the lower-level type. Two methods of this class are introduced and analyzed. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the Constant Positive Linear Dependence (CPLD) constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The reliability of the approach is tested by means of an exhaustive comparison against LANCELOT. All the problems of the CUTE collection are used in this comparison. Moreover, the resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the Spectral Projected Gradient method for solving the subproblems. Problems of this type with more than 3 × 10^6 variables and 14 × 10^6 constraints are solved in this way using moderate computer time.
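For orientation, a bare-bones augmented Lagrangian iteration on an invented two-variable problem (far simpler than the paper's setting with general lower-level constraints and CPLD-based analysis) looks like this; the inner minimization is done by plain gradient descent:

```python
import numpy as np

# Hedged sketch of a basic augmented Lagrangian method, not the paper's
# algorithm. Problem: min x1^2 + x2^2  s.t.  h(x) = x1 + x2 - 1 = 0.
# Known solution: x = (0.5, 0.5) with multiplier lam = -1.

def augmented_lagrangian(x, lam=0.0, rho=10.0, outer=20, inner=200, step=0.01):
    for _ in range(outer):
        for _ in range(inner):             # approximately minimize L(., lam, rho)
            h = x[0] + x[1] - 1.0
            # grad of f(x) + lam*h + (rho/2)*h^2
            grad = 2.0 * x + (lam + rho * h) * np.ones(2)
            x = x - step * grad
        lam += rho * (x[0] + x[1] - 1.0)   # first-order multiplier update
    return x, lam

x, lam = augmented_lagrangian(np.array([0.0, 0.0]))
print(x, lam)   # x near (0.5, 0.5), lam near -1
```

The subproblems here are unconstrained; in the paper's framework they would instead retain the "lower-level" constraints, which is the whole point of the method class.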
Global Continuation For Distance Geometry Problems
SIAM J. OPTIMIZATION, 1995
"... Distance geometry problems arise in the interpretation of NMR data and in the determination of protein structure. We formulate the distance geometry problem as a global minimization problem with special structure, and show that global smoothing techniques and a continuation approach for global optim ..."
Abstract

Cited by 83 (7 self)
Distance geometry problems arise in the interpretation of NMR data and in the determination of protein structure. We formulate the distance geometry problem as a global minimization problem with special structure, and show that global smoothing techniques and a continuation approach for global optimization can be used to determine solutions of distance geometry problems with a nearly 100% probability of success.
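The smoothing-plus-continuation idea can be sketched in one dimension with an invented double-well function (the paper's actual formulation is a structured distance-geometry objective): Gaussian smoothing of a polynomial has a closed form, the heavily smoothed function is unimodal, and the minimizer is tracked as the smoothing parameter is driven to zero.

```python
import numpy as np

# Hedged illustration, not the paper's method. f(x) = x^4 - 8 x^2 + x has two
# local minima (the global one near x = -2). Its Gaussian smoothing
#   <f>_sigma(x) = E[f(x + sigma*Z)] = x^4 + (6 sigma^2 - 8) x^2 + x + const
# is unimodal once sigma^2 >= 8/6, so continuation from large sigma works.

def grad_smoothed(x, sigma):
    return 4 * x**3 + 2 * (6 * sigma**2 - 8) * x + 1

def minimize_1d(x, sigma, step=0.01, iters=2000):
    for _ in range(iters):                 # plain gradient descent
        x -= step * grad_smoothed(x, sigma)
    return x

x = 0.0
for sigma in [2.0, 1.5, 1.0, 0.5, 0.25, 0.0]:   # continuation schedule
    x = minimize_1d(x, sigma)              # warm-start from previous minimizer
print(x)   # near the global minimizer of f (about -2.03), not the one near +2
```

Starting gradient descent at x = 0 directly on f (sigma = 0 only) can be attracted to the shallower well; the continuation path steers the iterate into the global basin first, which is the "nearly 100% success" mechanism the abstract alludes to.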
Space Mapping: The State of the Art
2004
Abstract

Cited by 80 (33 self)
We review the space-mapping (SM) technique and the SM-based surrogate (modeling) concept and their applications in engineering design optimization. For the first time, we present a mathematical motivation and place SM into the context of classical optimization. The aim of SM is to achieve a satisfactory solution with a minimal number of computationally expensive "fine" model evaluations. SM procedures iteratively update and optimize surrogates based on a fast physically based "coarse" model. Proposed approaches to SM-based optimization include the original algorithm, the Broyden-based aggressive SM algorithm, various trust-region approaches, neural SM, and implicit SM. Parameter extraction is an essential SM subproblem; it is used to align the surrogate (enhanced coarse model) with the fine model. Different approaches to enhance uniqueness are suggested, including the recent gradient parameter-extraction approach. Novel physical illustrations are presented, including the cheese-cutting and wedge-cutting problems. Significant practical applications are reviewed.
A semidefinite framework for trust region subproblems with applications to large scale minimization
2002
Indefinite Trust Region Subproblems And Nonsymmetric Eigenvalue Perturbations
1995
Abstract

Cited by 73 (18 self)
This paper extends the theory of trust region subproblems in two ways: (i) it allows indefinite inner products in the quadratic constraint, and (ii) it uses a two-sided (upper and lower bound) quadratic constraint. Characterizations of optimality are presented which have no gap between necessity and sufficiency. Conditions for the existence of solutions are given in terms of the definiteness of a matrix pencil. A simple dual program is intro...
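For reference, the classical gap-free optimality characterization that this paper generalizes applies to the standard one-sided subproblem with the usual (definite) norm:

```latex
\[
  \min_{d}\; g^{T} d + \tfrac{1}{2}\, d^{T} H d
  \quad \text{subject to} \quad \|d\| \le \Delta .
\]
% A feasible d* is a global minimizer if and only if there exists \lambda \ge 0 with
\[
  (H + \lambda I)\, d^{*} = -g, \qquad
  \lambda \left( \Delta - \|d^{*}\| \right) = 0, \qquad
  H + \lambda I \succeq 0 .
\]
```

The paper's contribution is to obtain characterizations of the same gap-free kind when the norm-like constraint uses an indefinite inner product and is bounded on both sides.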
A New Trust Region Algorithm For Equality Constrained Optimization
1995
Abstract

Cited by 72 (7 self)
We present a new trust region algorithm for solving nonlinear equality constrained optimization problems. At each iterate a change of variables is performed to improve the ability of the algorithm to follow the constraint level sets. The algorithm employs $\ell_2$ penalty functions for obtaining global convergence. Under certain assumptions we prove that this algorithm globally converges to a point satisfying the second order necessary optimality conditions; the local convergence rate is quadratic. Results of preliminary numerical experiments are presented.

1. Introduction. We consider the equality constrained optimization problem
\[
  \min\; f(x) \quad \text{subject to} \quad c(x) = 0 \tag{1.1}
\]
where $x \in \mathbb{R}^n$ and $f : \mathbb{R}^n \to \mathbb{R}$, $c : \mathbb{R}^n \to \mathbb{R}^m$ are smooth nonlinear functions. Problem (1.1) is often solved by successive quadratic programming (SQP) methods. At a current point $x_k \in \mathbb{R}^n$, SQP methods determine a search direction $d_k$ by solving a quadratic programming problem
\[
  \min\; \nabla f(x_k)^{T} d + \tfrac{1}{2}\, \ldots
\]
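The basic SQP iteration sketched in this introduction can be illustrated on an invented two-variable problem (no trust region or penalty function, unlike the paper's globalized algorithm): each step solves the KKT system of the QP subproblem with the Hessian of the Lagrangian and the linearized constraint.

```python
import numpy as np

# Hedged sketch of a plain (local) SQP iteration, not the paper's algorithm.
# Problem: min f(x) = x1^2 + x2^2  s.t.  c(x) = x1^2 + x2 - 1 = 0.
# KKT solutions: x = (+-sqrt(0.5), 0.5) with multiplier lam = -1.

def sqp(x, lam=0.0, iters=20):
    for _ in range(iters):
        g = 2.0 * x                              # grad f
        c = x[0] ** 2 + x[1] - 1.0
        a = np.array([2.0 * x[0], 1.0])          # grad c
        B = np.array([[2.0 + 2.0 * lam, 0.0],    # Hessian of L = f + lam*c
                      [0.0, 2.0]])
        # QP subproblem: min g^T d + 0.5 d^T B d  s.t.  a^T d + c = 0,
        # solved via its KKT system; the QP multiplier is the new lam.
        kkt = np.block([[B, a.reshape(2, 1)],
                        [a.reshape(1, 2), np.zeros((1, 1))]])
        rhs = np.concatenate([-g, [-c]])
        sol = np.linalg.solve(kkt, rhs)
        x, lam = x + sol[:2], sol[2]
    return x, lam

x, lam = sqp(np.array([1.0, 1.0]))
print(x, lam)   # converges to (sqrt(0.5), 0.5) with multiplier -1
```

This local iteration is exactly Newton's method on the KKT conditions and converges quadratically from a good starting point; the trust region and $\ell_2$ penalty function in the paper exist precisely to make such steps reliable far from a solution.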