A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I. Theoretical Advances, 1997
Abstract

Cited by 73 (4 self)
In this paper, the deterministic global optimization algorithm αBB (α-based Branch and Bound) is presented. This algorithm offers mathematical guarantees of convergence to a point arbitrarily close to the global minimum for the large class of twice-differentiable NLPs. The key idea is the construction of a converging sequence of upper and lower bounds on the global minimum through the convex relaxation of the original problem. This relaxation is obtained by (i) replacing all nonconvex terms of special structure (i.e., bilinear, trilinear, fractional, fractional trilinear, univariate concave) with customized tight convex lower bounding functions and (ii) utilizing α parameters, as defined by Maranas and Floudas (1994b), to generate valid convex underestimators for nonconvex terms of generic structure. In most cases, the calculation of appropriate values for the α parameters is a challenging task. A number of approaches are proposed, which rigorously generate a set of α parameters ...
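The generic-structure underestimator mentioned in (ii) is the well-known αBB construction L(x) = f(x) + α (x_lo − x)(x_hi − x), which is convex whenever α ≥ max(0, −½ · min f″) on the interval and coincides with f at the interval endpoints. The sketch below illustrates this for a single nonconvex univariate function; the helper name `alpha_underestimator` and the example bounds are illustrative, not from the paper.

```python
import math

def alpha_underestimator(f, d2f_min, x_lo, x_hi):
    """Return the alpha-BB convex underestimator of f on [x_lo, x_hi]:
        L(x) = f(x) + alpha * (x_lo - x) * (x_hi - x),
    with alpha = max(0, -0.5 * d2f_min), where d2f_min is a (lower bound
    on the) minimum of f'' over the interval."""
    alpha = max(0.0, -0.5 * d2f_min)

    def L(x):
        # The added quadratic term is nonpositive on [x_lo, x_hi],
        # so L(x) <= f(x) there, with equality at the endpoints.
        return f(x) + alpha * (x_lo - x) * (x_hi - x)

    return L

# Example: f(x) = sin(x) on [0, 2*pi]; f''(x) = -sin(x) has minimum -1,
# giving alpha = 0.5 and L''(x) = 1 - sin(x) >= 0 (convex).
f = math.sin
L = alpha_underestimator(f, -1.0, 0.0, 2.0 * math.pi)

# Sample the interval: L stays below f everywhere on it.
xs = [i * 2.0 * math.pi / 100 for i in range(101)]
assert all(L(x) <= f(x) + 1e-12 for x in xs)
```

Branching then shrinks [x_lo, x_hi], which tightens α and drives the lower bound from minimizing L toward the true global minimum of f.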