Results 1–10 of 41
Sums of squares, moment matrices and optimization over polynomials
, 2008
Cited by 158 (10 self)
We consider the problem of minimizing a polynomial over a semialgebraic set defined by polynomial equations and inequalities, which is NP-hard in general. Hierarchies of semidefinite relaxations have been proposed in the literature, involving positive semidefinite moment matrices and the dual theory of sums of squares of polynomials. We present these hierarchies of approximations and their main properties: asymptotic/finite convergence, optimality certificate, and extraction of global optimum solutions. We review the mathematical tools underlying these properties, in particular, some sums of squares representation results for positive polynomials, some results about moment matrices (in particular, of Curto and Fialkow), and the algebraic eigenvalue method for solving zero-dimensional systems of polynomial equations. We try whenever possible to provide detailed proofs and background.
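The sums-of-squares side of the hierarchy described above can be illustrated with a toy, pure-Python check (a hand-picked example, not one from the paper): an explicit SOS identity certifies a global lower bound on a polynomial, which is exactly what the relaxations search for by semidefinite programming.

```python
# Toy illustration of an SOS certificate (not the moment-matrix machinery):
# for p(x) = x^4 - 2x^2 + 3 we check numerically that
#     p(x) - 2 = (x^2 - 1)^2,
# a sum of squares, so 2 is a certified global minimum (attained at x = +-1).

def p(x):
    return x**4 - 2 * x**2 + 3

def sos_certificate(x):
    # single-square SOS decomposition of p(x) - 2
    return (x**2 - 1) ** 2

# verify the polynomial identity on a grid of sample points
for i in range(-50, 51):
    x = i / 10.0
    assert abs((p(x) - 2) - sos_certificate(x)) < 1e-9

print(p(1.0))  # the certified bound is attained: p(1) = 2.0
```

In the actual hierarchies, the squares are not guessed but found (when they exist) as a Gram-matrix factorization of a positive semidefinite matrix returned by an SDP solver.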
A comparison of the Sherali-Adams, Lovász-Schrijver and Lasserre relaxations for 0-1 programming
 Mathematics of Operations Research
, 2001
Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity
 SIAM Journal on Optimization
, 2006
Cited by 121 (30 self)
Unconstrained and inequality constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums of squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) relaxations are obtained. Numerical results from various test problems are included to show the improved performance of the SOS and SDP relaxations.
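The correlative sparsity pattern graph mentioned in the abstract can be sketched in a few lines, assuming (as a simplification) that each polynomial is represented only by the supports of its monomials, i.e., the sets of variable indices that appear together:

```python
# Sketch of a correlative sparsity pattern graph: two variables are adjacent
# iff they appear together in some monomial of the objective or a constraint.
# The graph's cliques then suggest smaller SOS/SDP blocks in the relaxation.
# Input format (sets of variable indices) is an assumption for illustration.

from itertools import combinations

def csp_graph(n_vars, supports):
    """Build adjacency sets from monomial supports (sets of variable indices)."""
    adj = {i: set() for i in range(n_vars)}
    for supp in supports:
        for i, j in combinations(sorted(supp), 2):
            adj[i].add(j)
            adj[j].add(i)
    return adj

# Example: f(x) = x0^2*x1 + x1*x2 + x3^2 -- note x0 and x3 never interact.
supports = [{0, 1}, {1, 2}, {3}]
g = csp_graph(4, supports)
print(g[0])  # {1}: x0 couples only with x1
```

The paper then works with a chordal extension of this graph; its maximal cliques determine which subsets of variables get their own moment/SOS blocks.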
Solving Large-Scale Sparse Semidefinite Programs for Combinatorial Optimization
 SIAM Journal on Optimization
, 1998
Cited by 119 (11 self)
We present a dual-scaling interior-point algorithm and show how it exploits the structure and sparsity of some large-scale problems. We solve the positive semidefinite relaxation of combinatorial and quadratic optimization problems subject to Boolean constraints. We report the first computational results of interior-point algorithms for approximating the maximum cut semidefinite programs with dimension up to 3000.
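The max-cut SDPs mentioned above are typically paired with hyperplane rounding to recover a cut. A minimal sketch of that rounding step, skipping the SDP solve and hand-coding an optimal vector embedding for the triangle graph (three unit vectors 120° apart; an assumption for illustration, not from the paper):

```python
# Hyperplane rounding for max-cut, Goemans-Williamson style: each vertex has
# a unit vector from the SDP relaxation; a random hyperplane through the
# origin splits the vertices into the two sides of the cut.

import math
import random

def round_cut(vectors):
    """Partition vertices by the sign of a random hyperplane's normal."""
    theta = random.uniform(0, 2 * math.pi)
    r = (math.cos(theta), math.sin(theta))  # random unit normal in the plane
    return [1 if v[0] * r[0] + v[1] * r[1] >= 0 else -1 for v in vectors]

def cut_value(signs, edges):
    return sum(1 for i, j in edges if signs[i] != signs[j])

# triangle graph: the optimal SDP vectors are 120 degrees apart
vecs = [(math.cos(2 * math.pi * k / 3), math.sin(2 * math.pi * k / 3))
        for k in range(3)]
edges = [(0, 1), (1, 2), (0, 2)]

random.seed(0)
cuts = [cut_value(round_cut(vecs), edges) for _ in range(1000)]
print(sum(cuts) / len(cuts))  # every hyperplane cuts exactly 2 of 3 edges
```

For this embedding the three vectors sum to zero, so no hyperplane can leave all three on one side: every rounding yields the optimal cut value 2.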
Complete search in continuous global optimization and constraint satisfaction
 Acta Numerica 13
, 2004
Approximating Quadratic Programming With Bound Constraints
 Mathematical Programming
, 1997
Cited by 79 (13 self)
We consider the problem of approximating the global maximum of a quadratic program (QP) with n variables subject to bound constraints. Based on the results of Goemans and Williamson [4] and Nesterov [6], we show that a 4/7-approximate solution can be obtained in polynomial time. Key words. Quadratic programming, global maximizer, approximation algorithm. This author is supported in part by NSF grant DMI-9522507. 1 Introduction. Consider the quadratic programming (QP) problem q̄(Q) := Maximize q(x) := xᵀQx (QP) Subject to −e ≤ x ≤ e, where Q ∈ ℝⁿˣⁿ is given and e ∈ ℝⁿ is the vector of all ones. Let x̄ = x̄(Q) be a maximizer of the problem. In this paper, without loss of generality, we assume that x̄ ≠ 0. Normally, there is a linear term in the objective function: q(x) = xᵀQx + cᵀx. However, the problem can be homogenized as Maximize q(x) := xᵀQx + t cᵀx Subject to −e ≤ x ≤ e, −1 ≤ t ≤ 1 by adding a scalar variable t. There always is an opti...
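The homogenization trick in the snippet above is easy to verify on a tiny instance: at t = 1 the homogenized objective recovers the original one, so the box-constrained problem with the extra variable subsumes the original. A pure-Python check with hand-picked data (the matrix and vectors are illustrative, not from the paper):

```python
# Check that q(x) = x^T Q x + c^T x equals the homogenized objective
# x^T Q x + t c^T x at t = 1, over the box -e <= x <= e, -1 <= t <= 1.

def q(x, Q, c):
    n = len(x)
    quad = sum(x[i] * Q[i][j] * x[j] for i in range(n) for j in range(n))
    lin = sum(c[i] * x[i] for i in range(n))
    return quad + lin

def q_hom(x, t, Q, c):
    n = len(x)
    quad = sum(x[i] * Q[i][j] * x[j] for i in range(n) for j in range(n))
    lin = t * sum(c[i] * x[i] for i in range(n))
    return quad + lin

Q = [[1.0, 2.0], [2.0, -1.0]]   # arbitrary symmetric 2x2 matrix
c = [0.5, -3.0]                 # linear-term coefficients
x = [0.3, -0.7]                 # a feasible point in the box [-1, 1]^2

assert q_hom(x, 1.0, Q, c) == q(x, Q, c)  # t = 1 recovers the original
print(q(x, Q, c))
```

Conversely, if (x*, t*) maximizes the homogenized problem, then t*·x* is feasible for the original problem with the same objective value, since the objective is invariant under flipping the sign of (x, t).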
A survey of the S-lemma
 SIAM Review
Cited by 59 (1 self)
In this survey we review the many faces of the S-lemma, a result about the correctness of the S-procedure. The basic idea of this widely used method came from control theory but it has important consequences in quadratic and semidefinite optimization, convex geometry, and linear algebra as well. These were all active research areas, but as there was little interaction between researchers in these different areas, their results remained mainly isolated. Here we give a unified analysis of the theory by providing three different proofs for the S-lemma and revealing hidden connections with various areas of mathematics. We prove some new duality results and present applications from control theory, error estimation, and computational geometry. Key words. S-lemma, S-procedure, control theory, nonconvex theorem of alternatives, numerical range, relaxation theory, semidefinite optimization, generalized convexities
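The S-lemma itself says that, for quadratics, the implication "g(x) ≥ 0 ⟹ f(x) ≥ 0" holds exactly when some multiplier λ ≥ 0 makes f − λg nonnegative everywhere. A numeric illustration with hand-picked one-variable quadratics (an assumption for this sketch, not the survey's example):

```python
# S-lemma illustration: with g(x) = 1 - x^2 and f(x) = 2 - x^2, the
# implication g(x) >= 0  =>  f(x) >= 0 holds, and the multiplier lam = 1
# certifies it globally, since f(x) - lam*g(x) = 1 >= 0 for every x.

def f(x):
    return 2 - x**2

def g(x):
    return 1 - x**2

lam = 1.0
for i in range(-100, 101):
    x = i / 10.0  # sample x in [-10, 10]
    assert f(x) - lam * g(x) >= 0  # global certificate, feasible x or not
    if g(x) >= 0:
        assert f(x) >= 0  # the implication being certified

print(f(0.0) - lam * g(0.0))  # the certificate is identically 1 here
```

The point of the certificate is that it converts a constrained nonnegativity question into an unconstrained one, which in the matrix formulation becomes a linear matrix inequality checkable by semidefinite programming.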
On Lagrangian relaxation of quadratic matrix constraints
 SIAM J. Matrix Anal. Appl
, 2000
Cited by 52 (18 self)
Quadratically constrained quadratic programs (QQPs) play an important modeling role for many diverse problems. These problems are in general NP-hard and numerically intractable. Lagrangian relaxations often provide good approximate solutions to these hard problems. Such relaxations are equivalent to semidefinite programming relaxations. For several special cases of QQP, e.g., convex programs and trust region subproblems, the Lagrangian relaxation provides the exact optimal value, i.e., there is a zero duality gap. However, this is not true for the general QQP, or even the QQP with two convex constraints, but a nonconvex objective. In this paper we consider a certain QQP where the quadratic constraints correspond to the matrix orthogonality condition XXᵀ = I. For this problem we show that the Lagrangian dual based on relaxing the constraints XXᵀ = I and the seemingly redundant constraints XᵀX = I has a zero duality gap. This result has natural applications to quadratic assignment and graph partitioning problems, as well as the problem of minimizing the weighted sum of the largest eigenvalues of a matrix. We also show that the technique of relaxing quadratic matrix constraints can be used to obtain a strengthened semidefinite relaxation for the max-cut problem. Key words. Lagrangian relaxations, quadratically constrained quadratic programs, semidefinite programming, quadratic assignment, graph partitioning, max-cut problems
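The "seemingly redundant" constraints above are redundant in the sense that, for a square matrix, XXᵀ = I and XᵀX = I describe the same feasible set, yet dualizing both tightens the Lagrangian bound. A quick pure-Python check of the equivalence on a 2×2 rotation (illustration only; the duality-gap result itself requires the SDP machinery of the paper):

```python
# For an orthogonal matrix both X X^T = I and X^T X = I hold, so adding the
# second set of constraints does not change the feasible set -- it only
# changes the Lagrangian dual, which is the paper's point.

import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def is_identity(A, tol=1e-12):
    return all(abs(A[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(len(A)) for j in range(len(A)))

t = 0.7  # rotation angle; any value works
X = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
print(is_identity(matmul(X, transpose(X))),
      is_identity(matmul(transpose(X), X)))  # both constraint sets hold
```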
Approximation Algorithms for Quadratic Programming
, 1998
Cited by 32 (5 self)
We consider the problem of approximating the global minimum of a general quadratic program (QP) with n variables subject to m ellipsoidal constraints. For m = 1, we rigorously show that an ε-minimizer, where error ε ∈ (0, 1), can be obtained in polynomial time, meaning that the number of arithmetic operations is a polynomial in n, m, and log(1/ε). For m ≥ 2, we present a polynomial-time (1 − 1/m²)-approximation algorithm as well as a semidefinite programming relaxation for this problem. In addition, we present approximation algorithms for solving QP under the box constraints and the assignment polytope constraints. Key words. Quadratic programming, global minimizer, polynomial-time approximation algorithm. The work of the first author was supported by the Australian Research Council; the second author was supported in part by the Department of Management Sciences of the University of Iowa where he performed this research during a research leave, and by the Natural Scien...