Results 1-10 of 135
Complete search in continuous global optimization and constraint satisfaction, Acta Numerica 13, 2004
"... A chapter for ..."
Review of nonlinear mixed-integer and disjunctive programming techniques
Optimization and Engineering, 2002
Abstract

Cited by 61 (15 self)
This paper's major objective is to present a unified overview and derivation of mixed-integer nonlinear programming (MINLP) techniques: Branch and Bound, Outer-Approximation, Generalized Benders, and Extended Cutting Plane methods, as applied to nonlinear discrete optimization problems that are expressed in algebraic form. The solution of MINLP problems with convex functions is presented first, followed by a brief discussion of extensions for the nonconvex case. The solution of logic-based representations, known as generalized disjunctive programs, is also described. Theoretical properties are presented, along with numerical comparisons on a small process network problem.
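The relaxation-based Branch and Bound scheme this abstract surveys can be sketched in miniature. The one-variable objective, its closed-form continuous relaxation, and the bounds below are illustrative assumptions of this sketch, not taken from the paper:

```python
# Minimal branch-and-bound sketch for an integer program: solve a continuous
# relaxation at each node, prune by bound, and branch on a fractional variable.
# Objective f(x) = (x - 2.4)^2, x integer in [0, 5] (an illustrative example).

def solve_relaxation(lo, hi, target=2.4):
    """Minimize (x - target)^2 over the continuous interval [lo, hi]."""
    x = min(max(target, lo), hi)          # clip the unconstrained minimizer
    return x, (x - target) ** 2

def branch_and_bound(lo, hi):
    best_x, best_val = None, float("inf")
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        if a > b:
            continue                      # empty subdomain
        x, val = solve_relaxation(a, b)
        if val >= best_val:
            continue                      # bound: relaxation no better than incumbent
        if abs(x - round(x)) < 1e-9:
            best_x, best_val = round(x), val   # relaxation already integral
        else:
            stack.append((a, int(x)))          # left child:  x <= floor(x*)
            stack.append((int(x) + 1, b))      # right child: x >= ceil(x*)
    return best_x, best_val

print(branch_and_bound(0, 5))  # the integer optimum is x = 2
```

Outer-Approximation, Generalized Benders, and Extended Cutting Plane methods replace the node relaxation step with iteratively refined linear or dual approximations, but share this bounding skeleton.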
Global Optimization of Mixed-Integer Nonlinear Programs: A Theoretical and Computational Study
Mathematical Programming, 2003
Abstract

Cited by 58 (1 self)
This work addresses the development of an efficient solution strategy for obtaining global optima of continuous, integer, and mixed-integer nonlinear programs. Towards this end, we develop novel relaxation schemes, range reduction tests, and branching strategies which we incorporate into the prototypical branch-and-bound algorithm. In the theoretical...
A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I. Theoretical Advances
1997
Abstract

Cited by 55 (4 self)
In this paper, the deterministic global optimization algorithm αBB (α-based Branch and Bound) is presented. This algorithm offers mathematical guarantees of convergence to a point arbitrarily close to the global minimum for the large class of twice-differentiable NLPs. The key idea is the construction of a converging sequence of upper and lower bounds on the global minimum through the convex relaxation of the original problem. This relaxation is obtained by (i) replacing all nonconvex terms of special structure (i.e., bilinear, trilinear, fractional, fractional trilinear, univariate concave) with customized tight convex lower bounding functions and (ii) utilizing α parameters as defined by Maranas and Floudas (1994b) to generate valid convex underestimators for nonconvex terms of generic structure. In most cases, the calculation of appropriate values for the α parameters is a challenging task. A number of approaches are proposed, which rigorously generate a set of α par...
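The α-based underestimator this abstract describes has the standard form L(x) = f(x) + α (x_L − x)(x_U − x); since the quadratic term is nonpositive on the box, L never exceeds f, and a large enough α makes L convex. A minimal one-dimensional check, using f(x) = sin(x) on [0, 2π] as an illustrative test function (for it f''(x) ≥ −1, so α = 0.5 suffices):

```python
import math

# Sketch of an αBB-style convex underestimator for f(x) = sin(x) on [0, 2*pi].
# The test function and the numeric verification are illustrative choices;
# alpha >= -min f''/2 = 0.5 guarantees convexity of L in this case.

def f(x):
    return math.sin(x)

def underestimator(x, x_lo, x_up, alpha):
    # (x_lo - x) * (x_up - x) <= 0 on [x_lo, x_up], so L(x) <= f(x) there.
    return f(x) + alpha * (x_lo - x) * (x_up - x)

x_lo, x_up, alpha = 0.0, 2 * math.pi, 0.5
h = (x_up - x_lo) / 200
xs = [x_lo + i * h for i in range(201)]

# L never exceeds f on the box ...
assert all(underestimator(x, x_lo, x_up, alpha) <= f(x) + 1e-12 for x in xs)

# ... and its second finite differences are nonnegative, i.e. L is convex.
second_diffs = [
    underestimator(xs[i - 1], x_lo, x_up, alpha)
    - 2 * underestimator(xs[i], x_lo, x_up, alpha)
    + underestimator(xs[i + 1], x_lo, x_up, alpha)
    for i in range(1, 200)
]
assert all(d >= -1e-9 for d in second_diffs)
print("valid convex underestimator")
```

Minimizing L over the box then yields the lower bound used at each node of the αBB branch-and-bound tree.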
Interval Analysis on Directed Acyclic Graphs for Global Optimization
J. Global Optimization, 2004
Abstract

Cited by 41 (8 self)
A directed acyclic graph (DAG) representation of optimization problems represents each variable, each operation, and each constraint in the problem formulation by a node of the DAG, with edges representing the flow of the computation.
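Forward interval evaluation over such a DAG can be sketched in a few lines; the node names, operator set, and dictionary encoding below are illustrative assumptions, not the paper's representation:

```python
# Minimal forward interval propagation on an expression DAG for
# f(x, y) = x*y + x*x. The node for x is shared by both products, which is
# what distinguishes a DAG from an expression tree.

def imul(a, b):
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

dag = {
    "x":  ("var", "x"),
    "y":  ("var", "y"),
    "xy": ("mul", "x", "y"),
    "xx": ("mul", "x", "x"),   # note: naive x*x ignores the dependency
    "f":  ("add", "xy", "xx"), # between the two factors and overestimates
}

def evaluate(dag, root, boxes):
    cache = {}
    def go(node):
        if node in cache:
            return cache[node]          # shared subexpressions evaluated once
        kind, *args = dag[node]
        if kind == "var":
            val = boxes[args[0]]
        elif kind == "mul":
            val = imul(go(args[0]), go(args[1]))
        else:  # "add"
            val = iadd(go(args[0]), go(args[1]))
        cache[node] = val
        return val
    return go(root)

print(evaluate(dag, "f", {"x": (-1.0, 2.0), "y": (0.0, 3.0)}))  # -> (-5.0, 10.0)
```

A backward pass over the same DAG (constraint propagation) can then tighten the variable boxes, which is the pairing the paper exploits for global optimization.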
Global Optimization of Mixed-Integer Nonlinear Problems
 AIChE J
Abstract

Cited by 20 (3 self)
Two novel deterministic global optimization algorithms for nonconvex mixed-integer nonlinear problems (MINLPs) are proposed, using the advances of the αBB algorithm for nonconvex NLPs (Adjiman et al., 1998a). The Special Structure Mixed-Integer αBB algorithm (SMIN-αBB) addresses problems with nonconvexities in the continuous variables and linear and mixed-bilinear participation of the binary variables. The General Structure Mixed-Integer αBB algorithm (GMIN-αBB) is applicable to a very general class of problems for which the continuous relaxation is twice continuously differentiable. Both algorithms are developed using the concepts of branch-and-bound, but they differ in their approach to each of the required steps. The SMIN-αBB algorithm is based on the convex underestimation of the continuous functions, while the GMIN-αBB algorithm is centered around the convex relaxation of the entire problem. Both algorithms rely on optimization- or interval-based variable bound updates to enhance effici...
An exact reformulation algorithm for large nonconvex NLPs involving bilinear terms
Journal of Global Optimization, 2005
Abstract

Cited by 19 (10 self)
Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, as well as, potentially, other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subset of Reformulation-Linearization Technique constraints) which do not affect the feasible region of the original NLP but tighten that of its convex relaxation to the extent that some bilinear terms may be dropped from the problem formulation. We present an efficient graph-theoretical algorithm for effecting such exact reformulations of large, sparse NLPs. The global solution of the reformulated problem using spatial Branch-and-Bound algorithms is usually significantly faster than that of the original NLP. We illustrate this point by applying our algorithm to a set of pooling and blending global optimization problems.
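The Reformulation-Linearization Technique constraints referred to here arise from multiplying bound constraints pairwise: for instance, (x − x_L)(y − y_L) ≥ 0 expands to w ≥ x_L·y + y_L·x − x_L·y_L once w replaces the bilinear term x·y. A minimal validity check of the resulting (McCormick) envelopes, with illustrative bounds not taken from the paper:

```python
# Validity check of the four RLT/McCormick inequalities for w = x*y on a box.

def mccormick_bounds(x, y, xL, xU, yL, yU):
    """Lower/upper envelopes for w = x*y implied by the four bound products."""
    lo = max(xL * y + yL * x - xL * yL,   # from (x - xL)(y - yL) >= 0
             xU * y + yU * x - xU * yU)   # from (xU - x)(yU - y) >= 0
    hi = min(xU * y + yL * x - xU * yL,   # from (xU - x)(y - yL) >= 0
             xL * y + yU * x - xL * yU)   # from (x - xL)(yU - y) >= 0
    return lo, hi

xL, xU, yL, yU = -1.0, 2.0, 0.0, 3.0      # illustrative variable bounds
for i in range(11):
    for j in range(11):
        x = xL + i * (xU - xL) / 10
        y = yL + j * (yU - yL) / 10
        lo, hi = mccormick_bounds(x, y, xL, xU, yL, yU)
        assert lo - 1e-9 <= x * y <= hi + 1e-9   # envelopes are valid on the box
print("RLT envelopes valid on the box")
```

When these linear constraints tighten the relaxation enough, a bilinear term (and its auxiliary variable w) can be dropped outright, which is the exact-reformulation step the paper automates graph-theoretically.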
Reformulations in Mathematical Programming: Definitions and Systematics
2008
Abstract

Cited by 19 (14 self)
A reformulation of a mathematical program is a formulation which shares some properties with, but is in some sense better than, the original program. Reformulations are important with respect to the choice and efficiency of the solution algorithms; furthermore, it is desirable that reformulations can be carried out automatically. Reformulation techniques are very common in mathematical programming but, interestingly, they have never been studied under a common framework. This paper takes some steps in this direction. We define a framework for storing and manipulating mathematical programming formulations, and give several fundamental definitions categorizing reformulations into essentially four types (opt-reformulations, narrowings, relaxations, and approximations). We establish some theoretical results and give reformulation examples for each type.
A Global Optimization Algorithm for Nonconvex Generalized Disjunctive Programming and Applications to Process Systems
Computers and Chemical Engineering, 2000
Abstract

Cited by 19 (9 self)
A global optimization algorithm for nonconvex Generalized Disjunctive Programming (GDP) problems is proposed in this paper. By making use of convex underestimating functions for bilinear, linear fractional, and concave separable functions in the continuous variables, the convex hull of each nonlinear disjunction is constructed. The relaxed convex GDP problem is then solved in the first level of a two-level branch-and-bound algorithm, in which a discrete branch-and-bound search is performed on the disjunctions to predict lower bounds. In the second level, a spatial branch-and-bound method is used to solve nonconvex NLP problems for updating the upper bound. The proposed algorithm exploits the convex hull relaxation for the discrete search, and the fact that the spatial branch-and-bound is restricted to fixed discrete variables, in order to predict tight lower bounds. Application of the proposed algorithm to several example problems is shown, as well as comparisons with other algorithms.
Global Optimization in Parameter Estimation of Nonlinear Algebraic Models via the Error-in-Variables Approach
Ind. Eng. Chem. Res., 1998
Abstract

Cited by 19 (3 self)
The estimation of parameters in nonlinear algebraic models through the error-in-variables method has been widely studied from a computational standpoint. The method involves the minimization of a weighted sum of squared errors subject to the model equations. Due to the nonlinear nature of the models used, the resulting formulation is nonconvex and may contain several local minima in the region of interest. Current methods tailored for this formulation, although computationally efficient, can only attain convergence to a local solution. In this paper, a global optimization approach based on a branch-and-bound framework and convexification techniques for general twice-differentiable nonlinear optimization problems is proposed for the parameter estimation of nonlinear algebraic models. The proposed convexification techniques exploit the mathematical properties of the formulation. Classical nonlinear estimation problems were solved and will be used to illustrate the various theoretical an...
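The error-in-variables formulation this abstract describes treats the "true" variable values as decision variables alongside the parameters. A tiny linear illustration (the model y = θ·x, the unit error weights, and the grid search are assumptions of this sketch, not the paper's test problems): minimize Σ (x̃ᵢ − xᵢ)² + (ỹᵢ − θxᵢ)² over θ and the xᵢ; for fixed θ the inner minimization is closed-form, profiling the problem down to one variable.

```python
# Error-in-variables fit of y = theta * x with errors in both measurements.
# For fixed theta, the optimal true x_i = (xt_i + theta*yt_i) / (1 + theta^2);
# substituting it collapses the objective to a one-variable profile.

def profile_objective(theta, xs, ys):
    return sum((y - theta * x) ** 2 for x, y in zip(xs, ys)) / (1 + theta ** 2)

def fit_eiv(xs, ys, grid=2000, lo=-10.0, hi=10.0):
    # Coarse grid scan; a rigorous method needs global optimization, since the
    # profiled objective can have several local minima for nonlinear models.
    best = min(
        (profile_objective(lo + k * (hi - lo) / grid, xs, ys),
         lo + k * (hi - lo) / grid)
        for k in range(grid + 1)
    )
    return best[1], best[0]

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # synthetic data lying exactly on y = 2x
theta, obj = fit_eiv(xs, ys)
print(theta, obj)            # prints: 2.0 0.0
```

For nonlinear models no such closed-form profiling exists, which is why the paper resorts to branch-and-bound with convexification over both the parameters and the true variable values.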