Results 1–10 of 38
Review of nonlinear mixed-integer and disjunctive programming techniques
Optimization and Engineering, 2002
Cited by 55 (15 self)
This paper has as a major objective to present a unified overview and derivation of mixed-integer nonlinear programming (MINLP) techniques, Branch and Bound, Outer-Approximation, Generalized Benders and Extended Cutting Plane methods, as applied to nonlinear discrete optimization problems that are expressed in algebraic form. The solution of MINLP problems with convex functions is presented first, followed by a brief discussion on extensions for the nonconvex case. The solution of logic-based representations, known as generalized disjunctive programs, is also described. Theoretical properties are presented, along with numerical comparisons on a small process network problem.
Global Optimization of Mixed-Integer Nonlinear Programs: A Theoretical and Computational Study
Mathematical Programming, 2003
Cited by 51 (1 self)
This work addresses the development of an efficient solution strategy for obtaining global optima of continuous, integer, and mixed-integer nonlinear programs. Towards this end, we develop novel relaxation schemes, range reduction tests, and branching strategies which we incorporate into the prototypical branch-and-bound algorithm. In the theoretical...
Rigorous Convex Underestimators for General Twice-Differentiable Problems
Journal of Global Optimization, 1996
Cited by 35 (15 self)
In order to generate valid convex lower bounding problems for nonconvex twice-differentiable optimization problems, a method that is based on second-order information of general twice-differentiable functions is presented. Using interval Hessian matrices, valid lower bounds on the eigenvalues of such functions are obtained and used in constructing convex underestimators. By solving several nonlinear example problems, it is shown that the lower bounds are sufficiently tight to ensure satisfactory convergence of the αBB, a branch and bound algorithm which relies on this underestimation procedure [3]. Key words: convex underestimators; twice-differentiable; interval analysis; eigenvalues. 1. Introduction The mathematical description of many physical phenomena, such as phase equilibrium, or of chemical processes generally requires the introduction of nonconvex functions. As the number of local solutions to a nonconvex optimization problem cannot be predicted a priori, the identifi...
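As a rough illustration of the underestimation procedure this abstract describes, the sketch below bounds the minimum eigenvalue of an interval Hessian with the interval Gershgorin circle theorem and builds the corresponding αBB-style quadratic underestimator. The function names and the one-dimensional example are illustrative, not taken from the paper:

```python
import numpy as np

def gershgorin_alpha(H_low, H_high):
    """Lower-bound the minimum eigenvalue of an interval Hessian [H_low, H_high]
    using the interval Gershgorin circle theorem, then return the alpha that
    convexifies the function on the box: alpha >= max(0, -lambda_min / 2)."""
    lam_min = np.inf
    n = H_low.shape[0]
    for i in range(n):
        radius = sum(max(abs(H_low[i, j]), abs(H_high[i, j]))
                     for j in range(n) if j != i)
        lam_min = min(lam_min, H_low[i, i] - radius)
    return max(0.0, -0.5 * lam_min)

def alpha_underestimator(f, alpha, xL, xU):
    """Convex underestimator L(x) = f(x) + alpha * sum_i (xL_i - x_i)(xU_i - x_i).
    The added quadratic is nonpositive on the box, so L(x) <= f(x) there, and
    its curvature 2*alpha offsets the worst negative eigenvalue of f."""
    return lambda x: f(x) + alpha * np.sum((xL - x) * (xU - x))

# Example: f(x) = sin(x) on [0, 2*pi]; f''(x) = -sin(x) lies in [-1, 1]
alpha = gershgorin_alpha(np.array([[-1.0]]), np.array([[1.0]]))  # -> 0.5
L = alpha_underestimator(lambda x: np.sin(x[0]), alpha,
                         np.array([0.0]), np.array([2 * np.pi]))
```

Evaluating `L` at any point of the box stays at or below `sin`, which is the property the branch-and-bound lower bounds rely on.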
Global minimization using an Augmented Lagrangian method with variable lower-level constraints
2007
Cited by 21 (1 self)
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εk-global minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
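The outer loop this abstract describes can be illustrated on a toy one-dimensional problem. In the sketch below a dense grid search over the box stands in for the εk-global subproblem solver (the paper itself uses the αBB method); names and tolerances are our own:

```python
import numpy as np

def augmented_lagrangian(f, h, box, rho=10.0, iters=20):
    """Augmented Lagrangian outer loop for min f(x) s.t. h(x) = 0, x in a box.
    Each subproblem minimizes f + lam*h + (rho/2)*h^2 over the box; a coarse
    grid search emulates a global subproblem solve."""
    lam = 0.0
    grid = np.linspace(box[0], box[1], 100001)
    for _ in range(iters):
        AL = f(grid) + lam * h(grid) + 0.5 * rho * h(grid) ** 2
        x = grid[np.argmin(AL)]      # (approximately) global subproblem solve
        lam += rho * h(x)            # first-order multiplier update
    return x, lam

# Toy problem: min x^2 subject to x - 1 = 0 on [-2, 2]
# KKT gives x* = 1 with multiplier lambda* = -2 (from 2x + lambda = 0)
x, lam = augmented_lagrangian(lambda t: t ** 2, lambda t: t - 1.0, (-2.0, 2.0))
```

With `rho = 10` the multiplier iteration contracts geometrically, so `x` approaches 1 and `lam` approaches -2 within a few outer iterations.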
Reformulations in Mathematical Programming: A Computational Approach
Cited by 17 (13 self)
Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, and objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathematical expressions of the parameters and decision variables, and therefore excludes optimization of black-box functions. A reformulation of a mathematical program P is a mathematical program Q obtained from P via symbolic transformations applied to the sets of variables, objectives and constraints. We present a survey of existing reformulations interpreted along these lines, some example applications, and describe the implementation of a software framework for reformulation and optimization.
Global Optimization of MINLP Problems in Process Synthesis and Design
Computers & Chemical Engineering, 1997
Cited by 16 (6 self)
Two new methodologies for the global optimization of MINLP models, the Special structure Mixed Integer Nonlinear αBB (SMIN-αBB) and the General structure Mixed Integer Nonlinear αBB (GMIN-αBB), are presented. Their theoretical foundations provide guarantees that the global optimum solution of MINLPs involving twice-differentiable nonconvex functions in the continuous variables can be identified. The conditions imposed on the functionality of the binary variables differ for each method: linear and mixed bilinear terms can be treated with the SMIN-αBB; mixed nonlinear terms whose continuous relaxation is twice-differentiable are handled by the GMIN-αBB. While both algorithms use the concept of a branch & bound tree, they rely on fundamentally different bounding and branching strategies. In the GMIN-αBB algorithm, lower (upper) bounds at each node result from the solution of convex (nonconvex) MINLPs derived from the original problem. The construction of convex lower bound...
A Global Optimization Algorithm for Nonconvex Generalized Disjunctive Programming and Applications to Process Systems
Computers and Chemical Engineering, 2000
Cited by 14 (8 self)
A global optimization algorithm for nonconvex Generalized Disjunctive Programming (GDP) problems is proposed in this paper. By making use of convex underestimating functions for bilinear, linear fractional and concave separable functions in the continuous variables, the convex hull of each nonlinear disjunction is constructed. The relaxed convex GDP problem is then solved in the first level of a two-level branch and bound algorithm, in which a discrete branch and bound search is performed on the disjunctions to predict lower bounds. In the second level, a spatial branch and bound method is used to solve nonconvex NLP problems for updating the upper bound. The proposed algorithm exploits the convex hull relaxation for the discrete search, and the fact that the spatial branch and bound is restricted to fixed discrete variables in order to predict tight lower bounds. Application of the proposed algorithm to several example problems is shown, as well as comparisons with other algorithms.
Computational Experience With The Molecular Distance Geometry Problem
Cited by 13 (12 self)
In this work we consider the molecular distance geometry problem, which can be defined as the determination of the three-dimensional structure of a molecule based on distances between some pairs of atoms. We address the problem as a nonconvex least-squares problem. We apply three global optimization algorithms (spatial Branch-and-Bound, Variable Neighbourhood Search, Multi Level Single Linkage) to two sets of instances, one taken from the literature and the other new. Keywords: molecular conformation, distance geometry, global optimization, spatial Branch-and-Bound, variable neighbourhood search, multi level single linkage.
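The nonconvex least-squares formulation mentioned above can be written down directly. The sketch below (our own naming, assuming squared-distance residuals, one common variant of the formulation) evaluates the error of a candidate conformation:

```python
import numpy as np

def mdgp_objective(X, dists):
    """Nonconvex least-squares error for the molecular distance geometry
    problem: X is an (n, 3) array of candidate atom positions and dists maps
    atom pairs (i, j) to known inter-atomic distances. The value is zero
    exactly when every given distance is realized by X."""
    return sum((np.dot(X[i] - X[j], X[i] - X[j]) - d ** 2) ** 2
               for (i, j), d in dists.items())
```

A global minimizer with objective value zero is a valid embedding of the molecule; any of the three algorithms named in the abstract could, in principle, be run against this objective.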
Global optimization for the synthesis of integrated water systems in chemical processes
 Comp. Chem. Eng
Cited by 13 (5 self)
In this paper, we address the problem of optimal synthesis of an integrated water system, where water-using processes and water treatment operations are combined into a single network such that the total cost of obtaining freshwater for use in the water-using operations, and treating wastewater, is minimized. A superstructure, which incorporates all feasible design alternatives for water treatment, reuse and recycle, is proposed. We formulate this structure as a nonconvex Nonlinear Programming (NLP) problem, which is solved to global optimality. The problem takes the form of a nonconvex Generalized Disjunctive Program (GDP) if there is a flexibility of choosing different treatment technologies for the removal of the various contaminants in the wastewater streams. A new deterministic spatial branch and contract algorithm is proposed for optimizing such systems, in which piecewise under- and overestimators are used to approximate the nonconvex terms in the original model to obtain a convex relaxation whose solution gives a lower bound on the global optimum. These lower bounds are made to converge to the solution within a branch and bound procedure. Several examples are presented to illustrate the optimization of these integrated networks using the proposed algorithm.
Global Optimization in Generalized Geometric Programming
Engng, 1997
Cited by 12 (3 self)
A deterministic global optimization algorithm is proposed for locating the global minimum of generalized geometric (signomial) problems (GGP). By utilizing an exponential variable transformation the initial nonconvex problem (GGP) is reduced to a (DC) programming problem where both the constraints and the objective are decomposed into the difference of two convex functions. A convex relaxation of problem (DC) is then obtained based on the linear lower bounding of the concave parts of the objective function and constraints inside some box region. The proposed branch and bound type algorithm attains finite ε-convergence to the global minimum through the successive refinement of a convex relaxation of the feasible region and/or of the objective function and the subsequent solution of a series of nonlinear convex optimization problems. The efficiency of the proposed approach is enhanced by eliminating variables through monotonicity analysis, by maintaining tightly bound variables thro...
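The exponential variable transformation described in this abstract can be demonstrated on a single signomial term. A short sketch with illustrative names: substituting x_i = exp(y_i) leaves the term's value unchanged, and in the new variables each term c·exp(a·y) is convex when c > 0 and concave when c < 0, which yields the difference-of-convex split:

```python
import math

def signomial_term(c, a, x):
    """One signomial term  c * x_1^a_1 * ... * x_n^a_n  (all x_i > 0)."""
    val = c
    for xi, ai in zip(x, a):
        val *= xi ** ai
    return val

def transformed_term(c, a, y):
    """The same term after the substitution x_i = exp(y_i): c * exp(a . y).
    In y it is convex when c > 0 and concave when c < 0, the basis of the
    DC decomposition described in the abstract."""
    return c * math.exp(sum(ai * yi for ai, yi in zip(a, y)))

# The two forms agree at corresponding points x and y = log(x)
x = (2.0, 4.0)
y = tuple(math.log(xi) for xi in x)
t_original = signomial_term(2.0, (1.5, -0.5), x)
t_transformed = transformed_term(2.0, (1.5, -0.5), y)
```

Grouping the positive-coefficient terms (convex) and negative-coefficient terms (concave) of the transformed problem then gives the (DC) program the algorithm relaxes and branches on.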