Results 1–10 of 13
Complete search in continuous global optimization and constraint satisfaction, Acta Numerica 13
, 2004
"... A chapter for ..."
Global Optimization of Mixed-Integer Nonlinear Programs: A Theoretical and Computational Study
 Mathematical Programming
, 2003
Abstract

Cited by 50 (1 self)
This work addresses the development of an efficient solution strategy for obtaining global optima of continuous, integer, and mixed-integer nonlinear programs. Towards this end, we develop novel relaxation schemes, range reduction tests, and branching strategies which we incorporate into the prototypical branch-and-bound algorithm. In the theoretical...
Global minimization using an Augmented Lagrangian method with variable lower-level constraints
, 2007
Abstract

Cited by 21 (1 self)
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εk-global minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
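The outer loop described in this abstract (an εk-approximate minimization of the augmented Lagrangian, a first-order multiplier update, and a subproblem tolerance driven toward ε) can be sketched as follows. The toy problem, the penalty updates, and the crude gradient-descent subsolver standing in for the paper's αBB global method are all illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of an augmented Lagrangian outer loop with subproblem
# tolerances eps_k -> eps. The toy problem and the gradient-descent
# subsolver (standing in for a global alphaBB subsolver) are illustrative.

def f(x):                             # objective (illustrative)
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

def h(x):                             # equality constraint h(x) = 0
    return x[0] + x[1]

def aug_lag(x, lam, rho):
    c = h(x)
    return f(x) + lam * c + 0.5 * rho * c * c

def grad(F, x, d=1e-6):               # central-difference gradient
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += d
        xm[i] -= d
        g.append((F(xp) - F(xm)) / (2.0 * d))
    return g

def subsolve(F, x, gtol, steps=20000):
    """eps_k-approximate minimization: descend until the gradient is small."""
    for _ in range(steps):
        g = grad(F, x)
        gn2 = sum(gi * gi for gi in g)
        if gn2 ** 0.5 <= gtol:
            break
        t = 1.0                        # Armijo backtracking line search
        for _ in range(60):
            xn = [xi - t * gi for xi, gi in zip(x, g)]
            if F(xn) <= F(x) - 0.5 * t * gn2:
                break
            t *= 0.5
        x = xn
    return x

def outer_loop(x0, eps=1e-6):
    x, lam, rho, eps_k = x0[:], 0.0, 10.0, 1e-1
    for _ in range(15):
        x = subsolve(lambda z: aug_lag(z, lam, rho), x, eps_k)
        lam += rho * h(x)              # first-order multiplier update
        rho *= 2.0                     # strengthen the penalty
        eps_k = max(eps, 0.1 * eps_k)  # drive subproblem tolerance to eps
        if abs(h(x)) <= eps:
            break
    return x

x_star = outer_loop([0.0, 0.0])
```

For this toy problem the multiplier tends to 1 and the iterates approach (1.5, −1.5); the key structural point mirrored from the abstract is only the shrinking subproblem tolerance εk → ε.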
Reformulation and Convex Relaxation Techniques for Global Optimization
 4OR
, 2004
Abstract

Cited by 9 (7 self)
Many engineering optimization problems can be formulated as nonconvex nonlinear programming problems (NLPs) involving a nonlinear objective function subject to nonlinear constraints. Such problems may exhibit more than one locally optimal point. However, one is often solely or primarily interested in determining the globally optimal point. This thesis is concerned with techniques for establishing such global optima using spatial Branch-and-Bound (sBB) algorithms.
An interval partitioning approach for continuous constrained optimization
 Models and Algorithms in Global Optimization
, 2006
Abstract

Cited by 3 (3 self)
Constrained Optimization Problems (COPs) are encountered in many scientific fields concerned with industrial applications such as kinematics, chemical process optimization, molecular design, etc. When nonlinear relationships among variables are defined by problem constraints, resulting in nonconvex feasible sets, the problem of identifying feasible solutions may become very hard. Consequently, finding the location of the global optimum in the COP is more difficult than in bound-constrained global optimization problems. This chapter proposes a new interval partitioning method for solving the COP. The proposed approach involves a new subdivision direction selection method as well as an adaptive search tree framework where nodes (boxes defining different variable domains) are explored using a restricted hybrid depth-first and best-first branching strategy. This hybrid approach is also used for activating local search in boxes with the aim of identifying different feasible stationary points. The proposed search tree management approach improves the convergence speed of the interval partitioning method, which is also supported by the new parallel subdivision direction selection rule.
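As a companion to the interval partitioning ideas above, here is a minimal interval branch-and-bound sketch for a bound-constrained toy problem. The naive inclusion function, the widest-component bisection rule, and the pure best-first queue are standard textbook choices, not the chapter's new subdivision rule or its hybrid depth-first/best-first strategy.

```python
import heapq

# Minimal interval branch-and-bound sketch for a bound-constrained toy
# problem, f(x, y) = x^2 + y^2. Boxes are tuples of (lo, hi) intervals.

def f_interval(box):
    """Inclusion function: an interval enclosing f over the box."""
    lo = sum(min(a * a, b * b) if a * b > 0 else 0.0 for a, b in box)
    hi = sum(max(a * a, b * b) for a, b in box)
    return lo, hi

def midpoint_value(box):
    return sum(((a + b) / 2.0) ** 2 for a, b in box)

def interval_bb(box0, tol=1e-6):
    best = midpoint_value(box0)            # incumbent upper bound
    heap = [(f_interval(box0)[0], box0)]   # best-first: lowest bound first
    while heap:
        lb, box = heapq.heappop(heap)
        if lb > best - tol:
            continue                       # box cannot hold a better point
        best = min(best, midpoint_value(box))
        # Bisect along the widest coordinate direction.
        i = max(range(len(box)), key=lambda j: box[j][1] - box[j][0])
        a, b = box[i]
        m = (a + b) / 2.0
        for half in ((a, m), (m, b)):
            child = list(box)
            child[i] = half
            child = tuple(child)
            clb = f_interval(child)[0]
            if clb < best - tol:
                heapq.heappush(heap, (clb, child))
    return best

best_val = interval_bb(((-1.0, 2.0), (-1.5, 1.0)))  # global minimum is 0
```

The incumbent comes from box midpoints and the queue is pruned whenever a box's interval lower bound cannot improve on it; the chapter's contribution lies precisely in smarter versions of the subdivision and exploration choices hard-coded here.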
Feasibility-based bounds tightening via fixed points
Abstract

Cited by 3 (1 self)
The search tree size of the spatial Branch-and-Bound algorithm for Mixed-Integer Nonlinear Programming depends on many factors, one of which is the width of the variable ranges at every tree node. A range reduction technique often employed is called Feasibility-Based Bounds Tightening, which is known to be practically fast and is thus deployed at every node of the search tree. From time to time, however, this technique fails to converge to its limit point in finite time, thereby slowing the whole Branch-and-Bound search considerably. In this paper we propose a polynomial-time method, based on solving a linear program, for computing the limit point of the Feasibility-Based Bounds Tightening algorithm applied to linear equality and inequality constraints. Keywords: global optimization, MINLP, spatial Branch-and-Bound, range reduction, constraint programming.
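The iterative procedure whose limit point the paper characterizes can be sketched for linear inequality constraints over box bounds. The propagation formulas below are the standard interval ones and the constraint data are illustrative, not taken from the paper; note the max_rounds safeguard, since, as the abstract observes, the sweep need not reach its fixed point in finitely many rounds.

```python
# Sketch of iterative Feasibility-Based Bounds Tightening for linear
# inequality constraints a . x <= b over box bounds [lo, hi].

def fbbt_once(a, b, lo, hi):
    """One pass: tighten [lo, hi] using the single constraint a . x <= b."""
    lo, hi = lo[:], hi[:]
    for j, aj in enumerate(a):
        if aj == 0.0:
            continue
        # Smallest possible value of the remaining terms over the box.
        rest = sum(min(ai * lo[i], ai * hi[i])
                   for i, ai in enumerate(a) if i != j)
        if aj > 0.0:
            hi[j] = min(hi[j], (b - rest) / aj)   # from aj*xj <= b - rest
        else:
            lo[j] = max(lo[j], (b - rest) / aj)   # dividing flips the sense
    return lo, hi

def fbbt(constraints, lo, hi, max_rounds=100, tol=1e-9):
    """Sweep all constraints until a numerical fixed point. The exact
    limit point need not be reached in finitely many rounds, hence the
    max_rounds safeguard."""
    for _ in range(max_rounds):
        old = lo + hi
        for a, b in constraints:
            lo, hi = fbbt_once(a, b, lo, hi)
        if max(abs(u - v) for u, v in zip(old, lo + hi)) < tol:
            break
    return lo, hi

# x0 + x1 <= 1 and x1 - x0 <= 0 over the box [0, 10] x [0, 10].
lo, hi = fbbt([([1.0, 1.0], 1.0), ([-1.0, 1.0], 0.0)],
              [0.0, 0.0], [10.0, 10.0])
print(lo, hi)  # tightened to [0.0, 0.0] [1.0, 1.0]
```

The paper's contribution replaces the iteration loop with a single linear program that jumps directly to the limit point; the sketch only illustrates the fixed-point iteration being analyzed.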
A cutting surface method for uncertain linear programs with polyhedral stochastic dominance constraints
 SIAM Journal on Optimization
Abstract

Cited by 3 (1 self)
In this paper we study linear optimization problems with multidimensional linear positive second-order stochastic dominance constraints. By using the polyhedral properties of the second-order linear dominance condition we present a cutting-surface algorithm, and show its finite convergence. The cut generation problem is a difference of convex functions (DC) optimization problem. We exploit the polyhedral structure of this problem to present a novel branch-and-cut algorithm that incorporates concepts from concave minimization and binary integer programming. A linear programming problem is formulated for generating concavity cuts in our case, where the polyhedron is unbounded. We also present duality results for this problem relating the dual multipliers to utility functions, without the need to impose constraint qualifications, which again is possible because of the polyhedral nature of the problem. Numerical examples are presented showing the nature of solutions of our model.
On Interval-subgradient and No-good Cuts
 OPERATIONS RESEARCH LETTERS
, 2010
Abstract

Cited by 1 (1 self)
Interval-gradient cuts are (nonlinear) valid inequalities for nonconvex NLPs defined for constraints g(x) ≤ 0 with g being continuously differentiable in a box [x, x̄]. In this paper we define interval-subgradient cuts, a generalization to the case of nondifferentiable g, and show that no-good cuts (which have the form ‖x − x̂‖ ≥ ε for some norm and positive constant ε) are a special case of interval-subgradient cuts whenever the 1-norm is used. We then briefly discuss what happens if other norms are used.
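The no-good cut named above has a simple closed form and can be checked directly; the excluded point, the candidate points, and the tolerance below are illustrative assumptions.

```python
# A no-good cut excludes a neighborhood of a known point xhat via
# ||x - xhat||_1 >= eps. All data below are illustrative.

def violates_nogood(x, xhat, eps):
    """True if x is cut off, i.e. lies within eps of xhat in the 1-norm."""
    return sum(abs(xi - xh) for xi, xh in zip(x, xhat)) < eps

xhat = [1.0, 2.0]
inside = violates_nogood([1.05, 2.05], xhat, 0.2)   # True: excluded
outside = violates_nogood([1.5, 2.0], xhat, 0.2)    # False: cut satisfied
```

The abstract's observation is that this inequality, nonconvex as written, arises as the special case of an interval-subgradient cut when the 1-norm is chosen.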
Global Nonlinear Programming with possible infeasibility and finite termination
, 2012
Abstract

Cited by 1 (0 self)
In a recent paper, Birgin, Floudas and Martínez introduced an augmented Lagrangian method for global optimization. In their approach, augmented Lagrangian subproblems are solved using the αBB method and convergence to global minimizers was obtained assuming feasibility of the original problem. In the present research, the algorithm mentioned above will be improved in several crucial aspects. On the one hand, feasibility of the problem will not be required. Possible infeasibility will be detected in finite time by the new algorithms and optimal infeasibility results will be proved. On the other hand, finite termination results that guarantee optimality and/or feasibility up to any required precision will be provided. An adaptive modification in which subproblem tolerances depend on current feasibility and complementarity will also be given. The adaptive algorithm allows the augmented Lagrangian subproblems to be solved without requiring unnecessarily high precision in the intermediate steps of the method, which improves the overall efficiency. Experiments showing how the new algorithms and results are related to practical computations will be given.