Results 1-10 of 59
Cones Of Matrices And Successive Convex Relaxations Of Nonconvex Sets
, 2000
"... . Let F be a compact subset of the ndimensional Euclidean space R n represented by (finitely or infinitely many) quadratic inequalities. We propose two methods, one based on successive semidefinite programming (SDP) relaxations and the other on successive linear programming (LP) relaxations. Each ..."
Abstract

Cited by 51 (19 self)
 Add to MetaCart
(Show Context)
Let F be a compact subset of the n-dimensional Euclidean space ℝ^n represented by (finitely or infinitely many) quadratic inequalities. We propose two methods, one based on successive semidefinite programming (SDP) relaxations and the other on successive linear programming (LP) relaxations. Each of our methods generates a sequence of compact convex subsets C_k (k = 1, 2, ...) of ℝ^n such that (a) the convex hull of F ⊆ C_{k+1} ⊆ C_k (monotonicity), (b) ∩_{k=1}^∞ C_k = the convex hull of F (asymptotic convergence). Our methods are extensions of the corresponding Lovász-Schrijver lift-and-project procedures with the use of SDP or LP relaxation applied to general quadratic optimization problems (QOPs) with infinitely many quadratic inequality constraints. Utilizing descriptions of sets based on cones of matrices and their duals, we establish the exact equivalence of the SDP relaxation and the semi-infinite convex QOP relaxation proposed originally by Fujie and Kojima. Using th...
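The lift step shared by both relaxation methods can be made concrete: a quadratic function q(x) = xᵀQx + cᵀx + b becomes linear in a lifted matrix variable X once each product x_i x_j is replaced by an entry X_ij, and the lifted form is exact precisely when X = xxᵀ. A minimal pure-Python sketch of that equivalence (function names are illustrative, not from the paper):

```python
def quad_eval(Q, c, b, x):
    """Evaluate q(x) = x^T Q x + c^T x + b directly."""
    n = len(x)
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    lin = sum(c[i] * x[i] for i in range(n))
    return quad + lin + b

def lifted_eval(Q, c, b, x, X):
    """Evaluate the lifted form <Q, X> + c^T x + b, which is linear in (x, X);
    X_ij stands in for the product x_i x_j."""
    n = len(x)
    inner = sum(Q[i][j] * X[i][j] for i in range(n) for j in range(n))
    return inner + sum(c[i] * x[i] for i in range(n)) + b

def outer(x):
    """Rank-one lifting X = x x^T, on which the lifted form is exact."""
    return [[xi * xj for xj in x] for xi in x]
```

The SDP relaxation then keeps the lifted linear constraints but only requires the weaker, convex condition that the block matrix [[1, xᵀ], [x, X]] be positive semidefinite, which is what yields the convex sets C_k.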
Computable representations for convex hulls of low-dimensional quadratic forms
, 2007
"... Let C be the convex hull of points { ( 1) ( 1 x x)T  x ∈ F ⊂ ℜ n}. Representing or approximating C is a fundamental problem for global optimization algorithms based on convex relaxations of products of variables. If n ≤ 4 and F is a simplex then C has a computable representation in terms of matric ..."
Abstract

Cited by 31 (11 self)
 Add to MetaCart
(Show Context)
Let C be the convex hull of points {(1, x)(1, x)^T : x ∈ ℜ^n, x ∈ F}. Representing or approximating C is a fundamental problem for global optimization algorithms based on convex relaxations of products of variables. If n ≤ 4 and F is a simplex, then C has a computable representation in terms of matrices X that are doubly nonnegative (positive semidefinite and componentwise nonnegative). If n = 2 and F is a box, then C has a representation that combines semidefiniteness with constraints on product terms obtained from the reformulation-linearization technique (RLT). The simplex result generalizes known representations for the convex hull of {(x1, x2, x1x2) : x ∈ F} when F ⊂ ℜ^2 is a triangle, while the result for box constraints generalizes the well-known fact that in this case the RLT constraints generate the convex hull of {(x1, x2, x1x2) : x ∈ F}. When n = 3 and F is a box, a representation for C can be obtained by utilizing the simplex result for n = 4 in conjunction with a triangulation of the 3-cube.
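For the box case with n = 2, the RLT product constraints mentioned above are the four McCormick inequalities, each obtained by linearizing a product of two nonnegative bound factors. A hedged sketch of the feasibility check (the `eps` tolerance and naming are illustrative):

```python
def mccormick_ineqs(x1, x2, w, l1, u1, l2, u2):
    """Check the four RLT/McCormick inequalities for w ~ x1*x2 over the box
    [l1, u1] x [l2, u2]. Each inequality linearizes a product of two
    nonnegative bound factors, e.g. (x1 - l1)(x2 - l2) >= 0 expands and,
    with x1*x2 replaced by w, gives w >= l2*x1 + l1*x2 - l1*l2."""
    eps = 1e-12
    return (w >= l2 * x1 + l1 * x2 - l1 * l2 - eps and   # (x1-l1)(x2-l2) >= 0
            w >= u2 * x1 + u1 * x2 - u1 * u2 - eps and   # (u1-x1)(u2-x2) >= 0
            w <= u2 * x1 + l1 * x2 - l1 * u2 + eps and   # (x1-l1)(u2-x2) >= 0
            w <= l2 * x1 + u1 * x2 - u1 * l2 + eps)      # (u1-x1)(x2-l2) >= 0
```

Every point with w exactly equal to x1*x2 inside the box satisfies all four inequalities; points with w far from the product are cut off, which is the sense in which these constraints describe the convex hull for n = 2.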
Reformulations in Mathematical Programming: A Computational Approach
"... Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathema ..."
Abstract

Cited by 24 (19 self)
 Add to MetaCart
(Show Context)
Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, and objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathematical expressions of the parameters and decision variables, and therefore excludes optimization of black-box functions. A reformulation of a mathematical program P is a mathematical program Q obtained from P via symbolic transformations applied to the sets of variables, objectives and constraints. We present a survey of existing reformulations interpreted along these lines, some example applications, and describe the implementation of a software framework for reformulation and optimization.
REFORMULATIONS IN MATHEMATICAL PROGRAMMING: DEFINITIONS AND SYSTEMATICS
, 2008
"... A reformulation of a mathematical program is a formulation which shares some properties with, but is in some sense better than, the original program. Reformulations are important with respect to the choice and efficiency of the solution algorithms; furthermore, it is desirable that reformulations c ..."
Abstract

Cited by 23 (17 self)
 Add to MetaCart
A reformulation of a mathematical program is a formulation which shares some properties with, but is in some sense better than, the original program. Reformulations are important with respect to the choice and efficiency of the solution algorithms; furthermore, it is desirable that reformulations can be carried out automatically. Reformulation techniques are very common in mathematical programming, but interestingly they have never been studied under a common framework. This paper takes some steps in this direction. We define a framework for storing and manipulating mathematical programming formulations, and give several fundamental definitions categorizing reformulations into essentially four types (opt-reformulations, narrowings, relaxations and approximations). We establish some theoretical results and give reformulation examples for each type.
An exact reformulation algorithm for large nonconvex NLPs involving bilinear terms
 Journal of Global Optimization
, 2005
"... Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, as well as, potentially, other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subse ..."
Abstract

Cited by 23 (11 self)
 Add to MetaCart
(Show Context)
Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, as well as, potentially, other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subset of Reformulation-Linearization Technique constraints) which do not affect the feasible region of the original NLP but tighten that of its convex relaxation to the extent that some bilinear terms may be dropped from the problem formulation. We present an efficient graph-theoretical algorithm for effecting such exact reformulations of large, sparse NLPs. The global solution of the reformulated problem using spatial Branch-and-Bound algorithms is usually significantly faster than that of the original NLP. We illustrate this point by applying our algorithm to a set of pooling and blending global optimization problems.
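The RLT constraints in question come from multiplying a linear equality Σ_j a_j x_j = b through by a variable x_i, which yields the valid linearized equality Σ_j a_j w_ij = b x_i in the variables w_ij ~ x_i x_j; when all but one w_ij in such an equality already appear in the model, the remaining bilinear term can be substituted out exactly. A dict-based sketch of that multiplication step (a hypothetical illustration, not the paper's graph algorithm):

```python
def rlt_multiply(lin_coeffs, rhs, i):
    """Multiply the linear equality sum_j a_j x_j = rhs by variable x_i.
    lin_coeffs maps variable index j -> coefficient a_j. Returns the
    coefficients of the valid linearized equality
    sum_j a_j * w[i, j] - rhs * x_i = 0."""
    w_coeffs = {(i, j): a for j, a in lin_coeffs.items()}
    return w_coeffs, -rhs

def holds_on_point(lin_coeffs, rhs, i, x):
    """Verify the generated equality holds with w[i, j] = x_i * x_j at a
    point x that satisfies the original linear equality."""
    w_coeffs, xi_coeff = rlt_multiply(lin_coeffs, rhs, i)
    lhs = sum(a * x[p] * x[q] for (p, q), a in w_coeffs.items())
    return abs(lhs + xi_coeff * x[i]) < 1e-9
```

Because the generated constraint is an identity on the original feasible region, adding it does not change the NLP's solutions; it only tightens the convex relaxation in the lifted w variables.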
Convex envelopes of multilinear functions over a unit hypercube and over special discrete sets
 ACTA MATHEMATICA VIETNAMICA
, 1997
"... In this paper, we present some general as well as explicit characterizations of the convex envelope of multilinear functions defined over a unit hypercube. A new approach is used to derive this characterization via a related convex hull representation obtained by applying the ReformulationLineariz ..."
Abstract

Cited by 19 (1 self)
 Add to MetaCart
(Show Context)
In this paper, we present some general as well as explicit characterizations of the convex envelope of multilinear functions defined over a unit hypercube. A new approach is used to derive this characterization via a related convex hull representation obtained by applying the Reformulation-Linearization Technique (RLT) of Sherali and Adams (1990, 1994). For the special cases of multilinear functions having coefficients that are either all +1 or all −1, we develop explicit formulae for the corresponding convex envelopes. Extensions of these results are given for the case when the multilinear function is defined over discrete sets, including explicit formulae for the foregoing special cases when this discrete set is represented by generalized upper bounding (GUB) constraints in binary variables. For more general cases of multilinear functions, we also discuss how this construct can be used to generate suitable relaxations for solving nonconvex optimization problems that include such structures.
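The simplest all-+1 instance, a single monomial x1 ⋯ xn over [0, 1]^n, has the well-known explicit convex envelope max{0, Σ_i x_i − (n − 1)}: it agrees with the monomial at every vertex of the hypercube and underestimates it everywhere else. A quick sketch checking these two properties for this single-monomial case (the formula is stated here only for that case, not for general multilinear functions):

```python
from math import prod
from itertools import product as cartesian

def monomial(x):
    """The multilinear monomial x1 * x2 * ... * xn."""
    return prod(x)

def convex_envelope(x):
    """Explicit convex envelope of the monomial over the unit hypercube
    (single-term, all-+1 case): max(0, sum(x) - (n - 1))."""
    return max(0.0, sum(x) - (len(x) - 1))
```

At a 0/1 vertex the two agree: both equal 1 when every coordinate is 1, and 0 otherwise; at interior points the envelope lies below the monomial, as a convex underestimator must.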
Consensus Set Maximization with Guaranteed Global Optimality for Robust Geometry Estimation
"... Finding the largest consensus set is one of the key ideas used by the original RANSAC for removing outliers in robustestimation. However, because of its random and nondeterministic nature, RANSAC does not fulfill the goal of consensus set maximization exactly and optimally. Based on global optimiz ..."
Abstract

Cited by 17 (4 self)
 Add to MetaCart
(Show Context)
Finding the largest consensus set is one of the key ideas used by the original RANSAC for removing outliers in robust estimation. However, because of its random and non-deterministic nature, RANSAC does not fulfill the goal of consensus set maximization exactly and optimally. Based on global optimization, this paper presents a new algorithm that solves the problem exactly. We reformulate the problem as a mixed integer program (MIP), and solve it via a tailored branch-and-bound method, where the bounds are computed from the MIP's convex underestimators. By exploiting the special structure of linear robust estimation, the new algorithm is also made efficient from a computational point of view.
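The consensus-maximization objective itself is easy to state in code. The toy sketch below scores lines through every pair of points for a 2D line-fitting problem; unlike RANSAC it is deterministic, but it is only an exhaustive baseline illustrating the objective, not the paper's branch-and-bound MIP method (all names and the vertical-residual inlier test are illustrative assumptions):

```python
def consensus_size(points, a, b, tol):
    """Count inliers of the line y = a*x + b within vertical residual tol."""
    return sum(1 for (x, y) in points if abs(y - (a * x + b)) <= tol)

def max_consensus_lines(points, tol):
    """Score the line through every pair of points and return the largest
    consensus set size found. Deterministic, but a brute-force baseline
    only; it is not guaranteed optimal over all lines in general."""
    best = 0
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = points[i], points[j]
            if x1 == x2:
                continue  # this simple sketch skips vertical lines
            a = (y2 - y1) / (x2 - x1)
            b = y1 - a * x1
            best = max(best, consensus_size(points, a, b, tol))
    return best
```

On four collinear points plus one gross outlier, the line through any two of the collinear points recovers a consensus set of size four while the outlier is rejected.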
Global optimization algorithm for heat exchanger networks
 Ind. Eng. Chem. Res
, 1993
"... A global optimization algorithm for heat exchanger networks ..."
Abstract

Cited by 15 (5 self)
 Add to MetaCart
A global optimization algorithm for heat exchanger networks
Pooling problem: Alternate formulations and solution methods
 Manage. Sci
, 2004
"... doi 10.1287/mnsc.1030.0207 ..."
(Show Context)
Towards Implementations of Successive Convex Relaxation Methods for Nonconvex Quadratic Optimization Problems
, 1999
"... Recently Kojima and Tuncel proposed new successive convex relaxation methods and their localizeddiscretized variants for general nonconvex quadratic optimization problems. Although an upper bound of the optimal objective function value within a previously given precision can be found theoretically ..."
Abstract

Cited by 12 (6 self)
 Add to MetaCart
(Show Context)
Recently Kojima and Tunçel proposed new successive convex relaxation methods and their localized-discretized variants for general nonconvex quadratic optimization problems. Although an upper bound of the optimal objective function value within a previously given precision can be found theoretically by solving a finite number of linear programs, several important implementation issues remain unsolved. In this paper, we discuss those issues, present practically implementable algorithms, and report numerical results.