Results 1–10 of 10
A Global Optimization Algorithm (GOP) for Certain Classes of Nonconvex NLPs: II. Application of Theory and Test Problems
Engng, 1990
Cited by 54 (21 self)

Abstract:
In Part I (Floudas and Visweswaran, 1990), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the rigorous solution of the problem through a series of primal and relaxed dual problems until the upper and lower bounds from these problems converged to an ε-global optimum. In this paper, theoretical results are presented for several classes of mathematical programming problems that include: (i) the general quadratic programming problem, (ii) quadratic programming problems with quadratic constraints, (iii) pooling and blending problems, and (iv) unconstrained and constrained optimization problems with polynomial terms in the objective function and/or constraints. For each class, a few examples are presented illustrating the approach. Keywords: Global Optimization, Quadratic Programming, Quadratic Constraints, Polynomial Functions, Pooling and Blending Problems.
A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I. Theoretical Advances
, 1997
Cited by 52 (3 self)

Abstract:
In this paper, the deterministic global optimization algorithm αBB (α-based Branch-and-Bound) is presented. This algorithm offers mathematical guarantees of convergence to a point arbitrarily close to the global minimum for the large class of twice-differentiable NLPs. The key idea is the construction of a converging sequence of upper and lower bounds on the global minimum through the convex relaxation of the original problem. This relaxation is obtained by (i) replacing all nonconvex terms of special structure (i.e., bilinear, trilinear, fractional, fractional trilinear, univariate concave) with customized tight convex lower bounding functions and (ii) utilizing some α parameters, as defined by Maranas and Floudas (1994b), to generate valid convex underestimators for nonconvex terms of generic structure. In most cases, the calculation of appropriate values for the α parameters is a challenging task. A number of approaches are proposed, which rigorously generate a set of α par...
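The α-based underestimator at the core of αBB can be sketched in the univariate case: the relaxation adds the term α(xᴸ − x)(xᵁ − x), which is nonpositive on the box, and any α ≥ max(0, −½ min f″) makes the sum convex. The sketch below is a simplified illustration, not the paper's implementation: rigorous αBB derives α from interval bounds on the Hessian, whereas this sketch merely samples f″ on a grid.

```python
import numpy as np

def alpha_underestimator(f, fpp, x_lo, x_hi, n_grid=200):
    """Sketch of the alpha-BB convex underestimator for a univariate,
    twice-differentiable f on [x_lo, x_hi]:
        L(x) = f(x) + alpha * (x_lo - x) * (x_hi - x),
    where alpha >= max(0, -min f''(x) / 2) makes L convex.
    Here min f'' is only estimated on a grid; rigorous alpha-BB
    uses interval bounds on the Hessian instead."""
    xs = np.linspace(x_lo, x_hi, n_grid)
    alpha = max(0.0, -min(fpp(x) for x in xs) / 2.0)
    def L(x):
        return f(x) + alpha * (x_lo - x) * (x_hi - x)
    return alpha, L

# Nonconvex example: f = sin on [0, 2*pi], so f'' = -sin and min f'' = -1.
alpha, L = alpha_underestimator(np.sin, lambda x: -np.sin(x), 0.0, 2.0 * np.pi)
# alpha is about 0.5, and L(x) <= sin(x) everywhere on the box.
```

Since (xᴸ − x)(xᵁ − x) ≤ 0 on the box, L underestimates f for any α ≥ 0; the lower bound from minimizing the convex L is what drives the branch-and-bound convergence described above.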
Quadratic Optimization
, 1995
Cited by 46 (3 self)

Abstract:
Quadratic optimization comprises one of the most important areas of nonlinear programming. Numerous problems in real-world applications, including problems in planning and scheduling, economies of scale, engineering design, and control, are naturally expressed as quadratic problems. Moreover, the quadratic problem is known to be NP-hard, which makes this one of the most interesting and challenging classes of optimization problems. In this chapter, we review various properties of the quadratic problem and discuss different techniques for solving various classes of quadratic problems. Some of the more successful algorithms for solving the special cases of bound-constrained and large-scale quadratic problems are considered. Examples of various applications of quadratic programming are presented. A summary of the available computational results for the algorithms to solve the various classes of problems is presented. Key words: Quadratic optimization, bilinear programming, concave pro...
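As a minimal illustration of the structure such special cases exploit: for a bound-constrained concave QP, a global minimizer is attained at a vertex of the box, so exhaustive vertex enumeration is exact for tiny instances. The data below are invented for illustration only.

```python
from itertools import product
import numpy as np

# Invented toy data: minimize x^T Q x + c^T x over a box, with Q negative
# definite, so the objective is concave and some vertex of the box is a
# global minimizer -- vertex enumeration is exact for tiny instances.
Q = np.array([[-2.0, 0.0],
              [0.0, -1.0]])
c = np.array([1.0, 1.0])
lo = np.array([0.0, 0.0])
hi = np.array([1.0, 2.0])

def obj(x):
    return float(x @ Q @ x + c @ x)

# Enumerate the 2^n box vertices and keep the best one.
best = min(product(*zip(lo, hi)), key=lambda v: obj(np.array(v)))
# Here best == (1.0, 2.0) with objective value -3.
```

The 2ⁿ blow-up in vertices is exactly why the chapter's algorithms replace enumeration with bounding and branching for larger instances.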
New Properties and Computational Improvement of the GOP Algorithm For Problems With Quadratic Objective Function and Constraints
Journal of Global Optimization, 1993
Cited by 21 (10 self)

Abstract:
In Floudas and Visweswaran (1990, 1992), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the solution of the problem through a series of primal and relaxed dual problems that provide valid upper and lower bounds, respectively, on the global solution. The algorithm was proved to have finite convergence to an ε-global optimum. In this paper, new theoretical properties are presented that help to enhance the computational performance of the GOP algorithm applied to problems of special structure. The effect of the new properties is illustrated through application of the GOP algorithm to a difficult indefinite quadratic problem, a multiperiod tankage quality problem that occurs frequently in the modeling of refinery processes, and a set of pooling/blending problems from the literature. In addition, extensive computational experience is reported for randomly generated concave and in...
A Global Optimization Method, αBB, for Process Design
Comput. Chem. Eng., 1996
Cited by 9 (1 self)

Abstract:
A global optimization algorithm, αBB, for twice-differentiable NLPs is presented. It operates within a branch-and-bound framework and requires the construction of a convex lower bounding problem. A technique to generate such a valid convex underestimator for arbitrary twice-differentiable functions is described. αBB has been applied to a variety of problems, and a summary of the results obtained is provided.
Pooling problem: Alternate formulations and solution methods
Manage. Sci., 2004
DOI: 10.1287/mnsc.1030.0207
Computational Results for an Efficient Implementation of the GOP Algorithm and Its Variants
Cited by 6 (2 self)

Abstract:
Recently, Floudas and Visweswaran (1990, 1993) proposed a global optimization algorithm (GOP) for the solution of a large class of nonconvex problems through a series of primal and relaxed dual subproblems that provide upper and lower bounds on the global solution. Visweswaran and Floudas (1995a) proposed a reformulation of the algorithm in the framework of a branch-and-bound approach that allows for an easier implementation. They also proposed an implicit enumeration of all the nodes in the resulting branch-and-bound tree using a mixed-integer linear programming (MILP) formulation, and a linear branching scheme that reduces the number of subproblems from exponential to linear. In this paper, a complete implementation of the new versions of the GOP algorithm, as well as detailed computational results of applying the algorithm to various classes of nonconvex optimization problems, is presented. The problems considered include pooling and blending problems, and problems with separation and heat exchang...
Stochastic Inventory Management for Tactical Process Planning under Uncertainties: MINLP Model and Algorithms
AIChE Journal, 2010, in press. DOI: 10.1002/aic.12338
Cited by 1 (1 self)

Abstract:
We address in this paper the midterm planning of chemical complexes with integration of stochastic inventory management under supply and demand uncertainty. By using the guaranteed service approach to model the time delays in the chemical flows inside the chemical process network, we capture the stochastic nature of the supply and demand variations, and develop an equivalent deterministic optimization model to minimize the total cost, including production cost, feedstock purchase cost, and cycle inventory and safety stock costs. The model simultaneously determines the optimal purchases of the feedstocks, production levels of the processes, sales of final products, and safety stock levels of all the chemicals, as well as the internal demand of the production processes. The model also captures “risk-pooling” effects to allow centralization of inventory management for chemicals that are consumed/produced by multiple processes. We formulate the model as a mixed-integer nonlinear program (MINLP) with a nonconvex objective function and nonconvex constraints. To solve the global optimization problem with modest computational times, we exploit some model ...
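The risk-pooling effect mentioned above follows the standard inventory argument: for independent demands, centralized safety stock scales with the root-sum-square of the demand standard deviations rather than their sum. A toy numeric sketch, where the service factor and demand figures are assumed for illustration and are not taken from the paper:

```python
import math

# Toy numbers (assumed, not from the paper): two processes consume the same
# chemical, with independent demand standard deviations sigma_1 and sigma_2.
z = 1.65                 # service-level factor, roughly the 95th percentile
sigmas = [30.0, 40.0]

# Decentralized: each process carries its own safety stock.
decentralized = z * sum(sigmas)                      # z * 70
# Pooled: one centralized stock; variances add, standard deviations do not.
pooled = z * math.sqrt(sum(s * s for s in sigmas))   # z * 50

# pooled <= decentralized always holds; the gap is the risk-pooling saving.
```

The square-root term in the pooled safety stock is also what makes the resulting objective nonconvex, hence the MINLP treatment in the abstract.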
Global Optimization of Mixed-Integer Quadratically Constrained Quadratic Programs (MIQCQP) through Piecewise-Linear and Edge-Concave Relaxations
, 2011
Abstract:
We propose a deterministic global optimization approach, whose novel contributions are rooted in the edge-concave and piecewise-linear underestimators, to address nonconvex mixed-integer quadratically-constrained quadratic programs (MIQCQP) to ε-global optimality. The facets of low-dimensional (n ≤ 3) edge-concave aggregations dominating the termwise relaxation of MIQCQP are introduced at every node of a branch-and-bound tree. Concave multivariable terms and sparsely distributed bilinear terms that do not participate in connected edge-concave aggregations are addressed through piecewise-linear relaxations. Extensive computational studies are presented for point packing problems, standard and generalized pooling problems, and examples from GLOBALLib [55].
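The "termwise relaxation" that the edge-concave facets are designed to dominate is, for a bilinear term, the classical McCormick envelope. A small sketch of those envelope bounds (standard textbook formulas, not the paper's code):

```python
def mccormick_bounds(x, y, x_lo, x_hi, y_lo, y_hi):
    """Standard McCormick envelope for the bilinear term w = x*y on a box:
    returns the tightest lower/upper envelope values at the point (x, y)."""
    lower = max(x_lo * y + x * y_lo - x_lo * y_lo,
                x_hi * y + x * y_hi - x_hi * y_hi)
    upper = min(x_hi * y + x * y_lo - x_hi * y_lo,
                x_lo * y + x * y_hi - x_lo * y_hi)
    return lower, upper

# At any point in the box the true product lies inside the envelope,
# e.g. for x = 0.5 in [0, 1] and y = 1.5 in [1, 2]:
lo, up = mccormick_bounds(0.5, 1.5, 0.0, 1.0, 1.0, 2.0)
# lo <= 0.5 * 1.5 <= up  (here lo = 0.5, up = 1.0)
```

Applying these bounds term by term to every bilinear product gives the baseline relaxation of an MIQCQP; the paper's contribution is tightening it with edge-concave facets over small groups of variables.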
Separation-network synthesis: global optimum through rigorous superstructure
Abstract:
The available algorithmic methods often fail to yield with certainty the global optima when solving even a relatively simple class of separation-network synthesis problems for which the cost functions are considered to be linear. This is attributable to two complications: first, the superstructures on which the solutions are based are incomplete; and second, the mathematical programming models derived for the problems are unnecessarily cumbersome. To circumvent these complications, a novel method is proposed here to generate the complete superstructure and the corresponding mathematical programming model necessary for the separation-network synthesis problem with linear cost functions. The efficacy of the proposed method is demonstrated by reexamining four published problems for which the optima obtained are claimed to be global. For all the problems reexamined, the costs of the solutions resulting from the present method are the same or as much as 30% lower than those of the published ...