GloptiPoly: Global Optimization over Polynomials with Matlab and SeDuMi
- ACM Trans. Math. Softw., 2002
Cited by 141 (22 self)
GloptiPoly is a Matlab/SeDuMi add-on to build and solve convex linear matrix inequality relaxations of the (generally non-convex) global optimization problem of minimizing a multivariable polynomial function subject to polynomial inequality, equality or integer constraints. It generates a series of lower bounds monotonically converging to the global optimum. Global optimality is detected and isolated optimal solutions are extracted automatically. Numerical experiments show that for most of the small- and medium-scale problems described in the literature, the global optimum is reached at low computational cost.
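GloptiPoly itself is a Matlab package, but the sum-of-squares (SOS) idea underlying its LMI relaxations is easy to illustrate: a polynomial p is SOS exactly when p(x) = b(x)^T Q b(x) for some positive semidefinite Gram matrix Q and monomial basis b(x). The sketch below is a toy, hand-built certificate (not GloptiPoly's solver) for the univariate example p(x) = x^4 + 2x^2 + 1:

```python
import numpy as np

# Hand-built Gram matrix for p(x) = x^4 + 2x^2 + 1 in the basis b(x) = [1, x, x^2].
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

def p_via_gram(x):
    """Evaluate b(x)^T Q b(x); should reproduce x^4 + 2x^2 + 1."""
    b = np.array([1.0, x, x**2])
    return b @ Q @ b

# Check the Gram representation at a few sample points.
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert abs(p_via_gram(x) - (x**4 + 2*x**2 + 1)) < 1e-9

# Q is positive semidefinite (eigenvalues 2, 0, 0), so p is a sum of
# squares -- here p = (x^2 + 1)^2 -- and hence nonnegative everywhere.
assert np.linalg.eigvalsh(Q).min() >= -1e-9
```

In GloptiPoly's setting, the existence of such a Q is decided by a semidefinite program, and maximizing γ such that p − γ remains SOS yields the lower bounds mentioned in the abstract.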
Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity
- SIAM Journal on Optimization, 2006
Cited by 122 (29 self)
Unconstrained and inequality constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums of squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) relaxations are obtained. Numerical results from various test problems are included to show the improved performance of the SOS and SDP relaxations.
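The correlative sparsity pattern graph can be sketched concretely: one node per variable, with an edge whenever two variables appear together in a monomial of the objective or in the same constraint polynomial. The representation below (polynomials as sets of variable indices per monomial or constraint) is a simplification for illustration, not the paper's implementation:

```python
from itertools import combinations

def csp_graph(objective_supports, constraint_vars):
    """Build the edge set of a correlative sparsity pattern graph.

    objective_supports: iterable of sets of variable indices, one per monomial.
    constraint_vars: iterable of sets of variable indices, one per constraint.
    """
    edges = set()
    for group in list(objective_supports) + list(constraint_vars):
        for u, v in combinations(sorted(group), 2):
            edges.add((u, v))
    return edges

# Objective x0^2*x1 + x1*x2 + x3^2 has monomial supports {0,1}, {1,2}, {3};
# one constraint couples x2 and x3.
edges = csp_graph([{0, 1}, {1, 2}, {3}], [{2, 3}])
assert edges == {(0, 1), (1, 2), (2, 3)}
```

The maximal cliques of (a chordal extension of) this graph then determine the smaller monomial sets used in the sparse SOS and SDP relaxations.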
Convergent SDP-Relaxations in Polynomial Optimization with Sparsity
- SIAM Journal on Optimization
Cited by 58 (16 self)
We consider a polynomial programming problem P on a compact semi-algebraic set K ⊂ R^n, described by m polynomial inequalities gj(X) ≥ 0, and with criterion f ∈ R[X]. We propose a hierarchy of semidefinite relaxations in the spirit of those of Waki et al. [9]. In particular, the SDP-relaxation of order r has the following two features: (a) the number of variables is O(κ^{2r}), where κ = max[κ1, κ2] with κ1 (resp. κ2) being the maximum number of variables appearing in the monomials of f (resp. appearing in a single constraint gj(X) ≥ 0); (b) the largest size of the LMIs (Linear Matrix Inequalities) is O(κ^r). This is to be compared with the respective number of variables O(n^{2r}) and LMI size O(n^r) in the original SDP-relaxations defined in [11]. Therefore, great computational savings are expected in case of sparsity in the data {gj, f}, i.e. when κ is small, a frequent case in practical applications of interest. The novelty with respect to [9] is that we prove convergence to the global optimum of P when the sparsity pattern satisfies a condition often encountered in large-scale problems of practical interest, known as the running intersection property in graph theory. In such cases, and as a by-product, we also obtain a new representation result for polynomials positive on a basic closed semialgebraic set, a sparse version of Putinar's Positivstellensatz [16].
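The size claims can be made tangible by counting monomials: the dense order-r relaxation works with all monomials of degree ≤ 2r in n variables, of which there are C(n + 2r, 2r) = O(n^{2r}), while the sparse version replaces n by κ. The sizes below are hypothetical, chosen only to show the scale of the savings:

```python
from math import comb

def num_monomials(n_vars, degree):
    # Number of monomials of degree <= `degree` in `n_vars` variables.
    return comb(n_vars + degree, degree)

n, kappa, r = 20, 4, 2                 # hypothetical sizes, not from the paper
dense  = num_monomials(n, 2 * r)       # O(n^{2r}) moment variables
sparse = num_monomials(kappa, 2 * r)   # O(kappa^{2r}) per clique
print(dense, sparse)                   # 10626 vs 70
```

Even at this modest order, the dense relaxation carries over a hundred times more moment variables than a single sparse block.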
Globally optimal estimates for geometric reconstruction problems
- In ICCV, 2005
Cited by 53 (15 self)
We introduce a framework for computing statistically optimal estimates for geometric reconstruction problems. While traditional algorithms often suffer from either local minima or non-optimality, or a combination of both, we pursue the goal of achieving global solutions of the statistically optimal cost function. Our approach is based on a hierarchy of convex relaxations to solve non-convex optimization problems with polynomials. These convex relaxations generate a monotone sequence of lower bounds, and we show how one can detect whether the global optimum is attained at a given relaxation. The technique is applied to a number of classical vision problems: triangulation, camera pose, homography estimation and last, but not least, epipolar geometry estimation. Experimental validation on both synthetic and real data is provided. In practice, only a few relaxations are needed for attaining the global optimum.
Minimizing polynomials via sum of squares over the gradient ideal
- Math. Program.
Cited by 51 (17 self)
A method is proposed for finding the global minimum of a multivariate polynomial via sum of squares (SOS) relaxation over its gradient variety. That variety consists of all points where the gradient is zero and it need not be finite. A polynomial which is nonnegative on its gradient variety is shown to be SOS modulo its gradient ideal, provided the gradient ideal is radical or the polynomial is strictly positive on the gradient variety. This opens up the possibility of solving previously intractable polynomial optimization problems. The related problem of constrained minimization is also considered, and numerical examples are discussed. Experiments show that our method using the gradient variety outperforms prior SOS methods.
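The gradient-variety idea has a simple univariate illustration: for f(x) = x^4 − 2x^2 + 3 the gradient variety is the set of critical points, i.e. the roots of f'(x) = 4x^3 − 4x, and the global minimum of f is the smallest value of f over that set. The numerical sketch below is a stand-in for the paper's algebraic SOS-modulo-gradient-ideal machinery, not a reproduction of it:

```python
import numpy as np

# f(x) = x^4 - 2x^2 + 3; its gradient variety is {x : f'(x) = 0}.
f = np.poly1d([1.0, 0.0, -2.0, 0.0, 3.0])
critical = f.deriv().roots                 # roots of 4x^3 - 4x: 0 and +/-1
real_critical = critical[np.isclose(critical.imag, 0)].real

# Minimize f over its (here finite) gradient variety.
global_min = min(f(x) for x in real_critical)
print(global_min)   # approximately 2.0, attained at x = +/-1
```

The paper handles the general multivariate case, including gradient varieties that are not finite, by working modulo the gradient ideal rather than enumerating critical points.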
Revisiting Two Theorems of Curto and Fialkow on Moment Matrices, 2004
Cited by 31 (4 self)
We revisit two results of Curto and Fialkow on moment matrices. The first result asserts that every sequence...
Semidefinite characterization and computation of zero-dimensional real radical ideals, 2007
Sum of squares methods for sensor network localization, 2006
Cited by 28 (3 self)
We formulate the sensor network localization problem as finding the global minimizer of a quartic polynomial. Then sum of squares (SOS) relaxations can be applied to solve it. However, the general SOS relaxations are too expensive to implement for large problems. Exploiting the special features of this polynomial, we propose a new structured SOS relaxation, and discuss its various properties. When distances are given exactly, this SOS relaxation often returns true sensor locations. At each step of interior point methods solving this SOS relaxation, the complexity is O(n^3), where n is the number of sensors. When the distances have small perturbations, we show that the sensor locations given by this SOS relaxation are accurate within a constant factor of the perturbation error under some technical assumptions. The performance of this SOS relaxation is tested on some randomly generated problems.
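The quartic formulation described above can be sketched directly (the SOS relaxation itself needs an SDP solver and is not reproduced here): stack the unknown sensor positions and penalize squared violations of the given pairwise squared distances. The function name and data layout below are illustrative choices, not from the paper:

```python
import numpy as np

def localization_objective(X, dist2):
    """X: (n, 2) candidate positions; dist2: dict {(i, j): squared distance}.

    Each term (||x_i - x_j||^2 - d_ij^2)^2 is quartic in the coordinates,
    so the whole objective is a quartic polynomial in the positions.
    """
    return sum((np.sum((X[i] - X[j]) ** 2) - d2) ** 2
               for (i, j), d2 in dist2.items())

# True positions of three sensors and the exact squared pairwise distances.
true_X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
dist2 = {(i, j): float(np.sum((true_X[i] - true_X[j]) ** 2))
         for i in range(3) for j in range(i + 1, 3)}

assert localization_objective(true_X, dist2) == 0.0       # exact data: zero cost
assert localization_objective(true_X * 1.5, dist2) > 0.0  # wrong scale: positive cost
```

A global minimizer of this polynomial recovers the configuration up to the rigid motions that preserve all pairwise distances, which is why anchor nodes are used in practice to pin down the solution.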