## A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I - Theoretical Advances (1997)


Citations: 56 (4 self)

### BibTeX

@ARTICLE{Adjiman97aglobal,
  author  = {C. S. Adjiman and S. Dallwig and C. A. Floudas and A. Neumaier},
  title   = {A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I - Theoretical Advances},
  journal = {},
  year    = {1997},
  volume  = {22}
}


### Abstract

In this paper, the deterministic global optimization algorithm αBB (α-based Branch and Bound) is presented. This algorithm offers mathematical guarantees of convergence to a point arbitrarily close to the global minimum for the large class of twice-differentiable NLPs. The key idea is the construction of a converging sequence of upper and lower bounds on the global minimum through the convex relaxation of the original problem. This relaxation is obtained by (i) replacing all nonconvex terms of special structure (i.e., bilinear, trilinear, fractional, fractional trilinear, univariate concave) with customized tight convex lower bounding functions and (ii) utilizing the α parameters, as defined by Maranas and Floudas (1994b), to generate valid convex underestimators for nonconvex terms of generic structure. In most cases, the calculation of appropriate values for the α parameters is a challenging task. A number of approaches are proposed, which rigorously generate a set of α par...
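The α-based underestimator for a generic nonconvex term can be illustrated with a small sketch. For a twice-differentiable f on a box [x^L, x^U], the αBB relaxation adds the separable quadratic α Σᵢ (xᵢ^L − xᵢ)(xᵢ^U − xᵢ), which is nonpositive on the box and convexifies f whenever α ≥ max{0, −½ λ_min(∇²f)} over the box. The 1-D test function, interval, and grid-based Hessian bound below are illustrative choices, not taken from the paper (a rigorous implementation would bound the Hessian via interval arithmetic):

```python
import numpy as np

# Example nonconvex function on [xL, xU] (illustrative, not from the paper).
def f(x):
    return np.sin(x) + 0.1 * x**2

def f_second(x):                      # analytic second derivative of f
    return -np.sin(x) + 0.2

xL, xU = 0.0, 6.0
grid = np.linspace(xL, xU, 1001)

# alphaBB requires alpha >= max(0, -0.5 * min f''(x)) over the interval.
# Here the bound is obtained by dense sampling; the paper develops
# rigorous interval-based alternatives.
alpha = max(0.0, -0.5 * f_second(grid).min())

def underestimator(x):
    # L(x) = f(x) + alpha * (xL - x) * (xU - x): convex, below f on the box,
    # and coinciding with f at the endpoints.
    return f(x) + alpha * (xL - x) * (xU - x)

print(np.all(underestimator(grid) <= f(grid) + 1e-12))   # valid lower bound
print(abs(underestimator(xL) - f(xL)) < 1e-12)           # tight at x = xL
```

Minimizing the convex underestimator over the box yields the lower bound used in the branch-and-bound tree; as the box is subdivided, the quadratic perturbation shrinks and the bounds converge.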