Results 1-10 of 22
Optimization of polynomials on compact semialgebraic sets
SIAM J. Optim.
Cited by 57 (4 self)
Abstract:
A basic closed semialgebraic subset S of R^n is defined by simultaneous polynomial inequalities g_1 ≥ 0, ..., g_m ≥ 0. We give a short introduction to Lasserre's method for minimizing a polynomial f on a compact set S of this kind. It consists of successively solving tighter and tighter convex relaxations of this problem, which can be formulated as semidefinite programs. We give a new short proof, constructive and elementary, for the convergence of the optimal values of these relaxations to the infimum f* of f on S. In the case where f possesses a unique minimizer x*, we prove that every sequence of "nearly" optimal solutions of the successive relaxations gives rise to a sequence of points in R^n converging to x*.
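Each level of Lasserre's hierarchy searches, by semidefinite programming, for a sum-of-squares certificate of a lower bound on f. As a minimal illustrative sketch (not the paper's method), the following verifies such a certificate by hand for a univariate polynomial: an SOS decomposition is exactly a positive semidefinite Gram matrix Q with f(x) = z(x)^T Q z(x) for a monomial vector z.

```python
import numpy as np

# f(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2 is a sum of squares.
# A certificate is a positive semidefinite Gram matrix Q with
# f(x) = z(x)^T Q z(x) for the monomial vector z(x) = (1, x, x^2).
# The SDP relaxations search for such a Q; here we only verify one
# found by hand.
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

# Q is positive semidefinite (eigenvalues 0, 0, 2), so f is SOS.
assert np.linalg.eigvalsh(Q).min() >= -1e-9

def f(x):
    return x**4 - 2 * x**2 + 1

# Check the identity f(x) = z^T Q z at sample points.
for x in np.linspace(-2.0, 2.0, 9):
    z = np.array([1.0, x, x**2])
    assert abs(z @ Q @ z - f(x)) < 1e-9
```

In the actual hierarchy, Q is a decision variable of a semidefinite program rather than a hand-built constant.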
Minimizing polynomials via sum of squares over the gradient ideal
Math. Program.
Cited by 51 (17 self)
Abstract:
A method is proposed for finding the global minimum of a multivariate polynomial via sum of squares (SOS) relaxation over its gradient variety. That variety consists of all points where the gradient is zero and it need not be finite. A polynomial which is nonnegative on its gradient variety is shown to be SOS modulo its gradient ideal, provided the gradient ideal is radical or the polynomial is strictly positive on the gradient variety. This opens up the possibility of solving previously intractable polynomial optimization problems. The related problem of constrained minimization is also considered, and numerical examples are discussed. Experiments show that our method using the gradient variety outperforms prior SOS methods.
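A minimal univariate sketch of the gradient-variety idea (without the paper's SOS/SDP machinery): for a coercive polynomial, the global minimum is attained at a critical point, so it suffices to examine the (here finite) zero set of the gradient.

```python
import numpy as np

# f(x) = x^4 - 3x^2 + x tends to +infinity, so its global minimum is
# attained at a real root of f'(x) = 4x^3 - 6x + 1 -- a point of the
# gradient variety.  We compute those roots numerically.
f = np.polynomial.Polynomial([0, 1, -3, 0, 1])   # x^4 - 3x^2 + x
df = f.deriv()

crit = df.roots()
real_crit = crit[np.abs(crit.imag) < 1e-9].real  # real gradient variety

fmin = min(f(x) for x in real_crit)              # global minimum of f
```

The paper's contribution is to certify such minima by sums of squares modulo the gradient ideal, which scales to the multivariate case where the variety cannot simply be enumerated.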
Semidefinite Representations for Finite Varieties
Mathematical Programming, 2002
Cited by 51 (7 self)
Abstract:
We consider the problem of minimizing a polynomial over a semialgebraic set defined by polynomial equalities and inequalities. When the polynomial equalities have a finite number of complex solutions and define a radical ideal we can reformulate this problem as a semidefinite programming problem. This semidefinite program involves combinatorial moment matrices, which are matrices indexed by a basis of the quotient vector space R[x_1, ..., x_n]/I. Our arguments are elementary and extend known facts for the grid case including 0/1 and polynomial programming. They also relate to known algebraic tools for solving polynomial systems of equations with finitely many complex solutions. Semidefinite approximations can be constructed by considering truncated combinatorial moment matrices; rank conditions are given (in a grid case) that ensure that the approximation solves the original problem at optimality.
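In the 0/1 grid case the equalities x_i^2 - x_i = 0 cut out the finite variety {0,1}^n, and the quotient R[x_1, ..., x_n]/I has the squarefree monomials as a basis. The semidefinite reformulation then computes the same optimal value that plain enumeration does; the brute-force check below is only an illustrative ground truth for a small hypothetical objective, not the moment-matrix method itself.

```python
from itertools import product

# Minimizing a polynomial over the variety of <x_i^2 - x_i>, i.e. over
# {0,1}^3, by enumeration.  The SDP over combinatorial moment matrices
# (indexed by squarefree monomials) recovers this value without
# enumerating the 2^n points.
def f(x):
    x1, x2, x3 = x
    return 3*x1 + 2*x2 - 4*x3 + 5*x1*x2 - x2*x3

fmin = min(f(x) for x in product((0, 1), repeat=3))
argmin = min(product((0, 1), repeat=3), key=f)
```

Enumeration costs 2^n evaluations; the point of the moment-matrix formulation is to replace it by a single semidefinite program of size governed by the quotient basis.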
On the complexity of Putinar’s Positivstellensatz
2008
Cited by 39 (8 self)
Abstract:
Let S = {x ∈ R^n : g_1(x) ≥ 0, ..., g_m(x) ≥ 0} be a basic closed semialgebraic set defined by real polynomials g_i. Putinar's Positivstellensatz says that, under a certain condition stronger than compactness of S, every real polynomial f positive on S possesses a representation f = Σ_{i=0}^m σ_i g_i, where g_0 := 1 and each σ_i is a sum of squares of polynomials. Such a representation is a certificate for the nonnegativity of f on S. We give a bound on the degrees of the terms σ_i g_i in this representation which depends on the description of S, the degree of f and a measure of how close f is to having a zero on S. As a consequence, we get information about the convergence rate of Lasserre's procedure for optimization of a polynomial subject to polynomial constraints.
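To fix the shape of such a representation, here is a toy certificate on S = [-1, 1] = {x : g_1(x) = 1 - x^2 ≥ 0}, verified with numpy's polynomial arithmetic. This only illustrates the form f = σ_0 + σ_1 g_1; it says nothing about the degree bounds the paper proves.

```python
import numpy as np

P = np.polynomial.Polynomial

# f(x) = 2 - x^2 is positive on S = [-1, 1], and admits the Putinar-type
# representation  f = sigma0 * g0 + sigma1 * g1  with g0 = 1,
# g1 = 1 - x^2, and the (trivially SOS) constants sigma0 = sigma1 = 1.
f      = P([2, 0, -1])   # 2 - x^2
g0     = P([1])          # the convention g0 := 1
g1     = P([1, 0, -1])   # 1 - x^2
sigma0 = P([1])
sigma1 = P([1])

lhs = sigma0 * g0 + sigma1 * g1
assert np.allclose(lhs.coef, f.coef)   # identity of coefficients
```

For nontrivial f, finding the SOS multipliers σ_i is itself a semidefinite program, and the paper bounds how large their degrees must be allowed to grow.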
Global optimization of polynomials using gradient tentacles and sums of squares
SIAM Journal on Optimization
Cited by 26 (0 self)
Abstract:
We consider the problem of computing the global infimum of a real polynomial f on R^n. Every global minimizer of f lies on its gradient variety, i.e., the algebraic subset of R^n where the gradient of f vanishes. If f attains a minimum on R^n, it is therefore equivalent to look for the greatest lower bound of f on its gradient variety. Nie, Demmel and Sturmfels recently proved a theorem about the existence of sums of squares certificates for such lower bounds. Based on these certificates, they find arbitrarily tight relaxations of the original problem that can be formulated as semidefinite programs and thus be solved efficiently. We deal here with the more general case when f is bounded from below but does not necessarily attain a minimum. In this case, the method of Nie, Demmel and Sturmfels might yield completely wrong results. In order to overcome this problem, we replace the gradient variety by larger semialgebraic subsets of R^n which we call gradient tentacles. It now gets substantially harder to prove the existence of the necessary sums of squares certificates.
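The failure mode the abstract describes is easy to exhibit with a standard example: f(x, y) = x^2 + (xy - 1)^2 is bounded below by 0 but never attains 0, while its only critical point is (0, 0) with f(0, 0) = 1. Minimizing over the gradient variety alone would therefore report 1 although inf f = 0.

```python
# f(x, y) = x^2 + (x*y - 1)^2.  Its gradient
#   (2x + 2y(xy - 1), 2x(xy - 1))
# vanishes only at (0, 0): the second component forces x = 0 or xy = 1,
# and xy = 1 contradicts the first component, so x = 0 and then y = 0.
def f(x, y):
    return x**2 + (x * y - 1)**2

# Value at the unique critical point -- what the gradient-variety
# method would return:
assert f(0.0, 0.0) == 1.0

# ...but f -> 0 along the unbounded sequence (1/t, t), so inf f = 0:
vals = [f(1.0 / t, t) for t in (10.0, 100.0, 1000.0)]
assert vals[0] > vals[1] > vals[2] and vals[2] < 1e-5
```

Gradient tentacles enlarge the gradient variety precisely so that such escape-to-infinity sequences are captured by the relaxations.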
Representation of nonnegative polynomials, degree bounds and applications to optimization
2006
Cited by 22 (2 self)
Abstract:
Natural sufficient conditions for a polynomial to have a local minimum at a point are considered. These conditions tend to hold with probability 1. It is shown that polynomials satisfying these conditions at each minimum point have nice presentations in terms of sums of squares. Applications are given to optimization on a compact set and also to global optimization. In many cases, there are degree bounds for such presentations. These bounds are of theoretical interest, but they appear to be too large to be of much practical use at present. In the final section, other more concrete degree bounds are obtained which ensure at least that the feasible set of solutions is not empty.
Representations of positive polynomials on noncompact semialgebraic sets via KKT ideals
2006
Cited by 19 (4 self)
Abstract:
This paper studies the representation of a positive polynomial f(x) on a noncompact semialgebraic set S = {x ∈ R^n : g_1(x) ≥ 0, ..., g_s(x) ≥ 0} modulo its KKT (Karush-Kuhn-Tucker) ideal. Under the assumption that the minimum value of f(x) on S is attained at some KKT point, we show that f(x) can be represented as a sum of squares (SOS) of polynomials modulo the KKT ideal if f(x) > 0 on S; furthermore, when the KKT ideal is radical, f(x) can be represented as a sum of squares (SOS) of polynomials modulo the KKT ideal if f(x) ≥ 0 on S. This is a generalization of results in [18], which discuss the SOS representations of nonnegative polynomials over gradient ideals.
Key words: polynomials, semialgebraic set, sum of squares (SOS), Karush-Kuhn-Tucker (KKT) system, KKT ideal.
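The KKT ideal is generated by the stationarity and complementarity equations of the KKT system, which are themselves polynomial. As a hedged one-dimensional sketch (an illustration of those equations, not of the paper's SOS representations), take f(x) = (x + 1)^2 on the noncompact set S = {x : g(x) = x ≥ 0} and enumerate the complementarity cases by hand:

```python
# KKT system for min f on {g >= 0} in one variable:
#   stationarity:      f'(x) - lam * g'(x) = 0
#   complementarity:   lam * g(x) = 0,  lam >= 0,  g(x) >= 0
# With f(x) = (x + 1)^2 and g(x) = x these are polynomial equations,
# and the two complementarity branches can be checked directly.
def fprime(x):
    return 2 * (x + 1)

candidates = []

# Case lam = 0:  f'(x) = 0  =>  x = -1, infeasible since g(-1) < 0.
if -1.0 >= 0:
    candidates.append(-1.0)

# Case g(x) = 0:  x = 0, with multiplier lam = f'(0) / g'(0) = 2 >= 0.
lam = fprime(0.0) / 1.0
if lam >= 0:
    candidates.append(0.0)

# The minimum of f over S is attained at the KKT point x = 0, f = 1.
best = min(candidates, key=lambda x: (x + 1) ** 2)
```

The paper's assumption is exactly that the minimum is attained at such a KKT point, in which case SOS certificates exist modulo the ideal these equations generate.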
Exposed faces of semidefinite representable sets
"... Abstract. A linear matrix inequality (LMI) is a condition stating that a symmetric matrix whose entries are affine linear combinations of variables is positive semidefinite. Motivated by the fact that diagonal LMIs define polyhedra, the solution set of an LMI is called a spectrahedron. Linear images ..."
Abstract

Cited by 18 (5 self)
 Add to MetaCart
(Show Context)
A linear matrix inequality (LMI) is a condition stating that a symmetric matrix whose entries are affine linear combinations of variables is positive semidefinite. Motivated by the fact that diagonal LMIs define polyhedra, the solution set of an LMI is called a spectrahedron. Linear images of spectrahedra are called semidefinite representable sets. Part of the interest in spectrahedra and semidefinite representable sets arises from the fact that one can efficiently optimize linear functions on them by semidefinite programming, like one can do on polyhedra by linear programming. It is known that every face of a spectrahedron is exposed. This is also true in the general context of rigidly convex sets. We study the same question for semidefinite representable sets. Lasserre proposed a moment matrix method to construct semidefinite representations for certain sets. Our main result is that this method can only work if all faces of the considered set are exposed. This necessary condition complements sufficient conditions recently proved by Lasserre, Helton and Nie.
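A small standard example of a spectrahedron (not taken from the paper): the closed unit disk is the solution set of the LMI A(x, y) = [[1 + x, y], [y, 1 - x]] ⪰ 0, since A has trace 2 and determinant 1 - x^2 - y^2. Membership can be tested by checking the smallest eigenvalue:

```python
import numpy as np

# The unit disk as a spectrahedron: A(x, y) >= 0 (PSD) iff both
# trace(A) = 2 >= 0 and det(A) = 1 - x^2 - y^2 >= 0.
def in_disk_lmi(x, y):
    A = np.array([[1.0 + x, y],
                  [y, 1.0 - x]])
    return np.linalg.eigvalsh(A).min() >= -1e-12

assert in_disk_lmi(0.0, 0.0)      # center
assert in_disk_lmi(1.0, 0.0)      # boundary point
assert not in_disk_lmi(0.9, 0.9)  # outside: 0.81 + 0.81 > 1
```

Every face of this disk (each boundary point) is exposed by a supporting line, consistent with the general fact the abstract cites for spectrahedra.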
Lower bounds for polynomials using geometric programming
"... We make use of a result of Hurwitz and Reznick [8] [19], and a consequence of this result due to Fidalgo and Kovacec [5], to determine a new sufficient condition for a polynomial f ∈ R[X1,..., Xn] of even degree to be a sum of squares. This result generalizes a result of Lasserre in [10] and a res ..."
Abstract

Cited by 8 (2 self)
 Add to MetaCart
(Show Context)
We make use of a result of Hurwitz and Reznick [8], [19], and a consequence of this result due to Fidalgo and Kovacec [5], to determine a new sufficient condition for a polynomial f ∈ R[X_1, ..., X_n] of even degree to be a sum of squares. This result generalizes a result of Lasserre in [10] and a result of Fidalgo and Kovacec in [5], and it also generalizes the improvements of these results given in [6]. We apply this result to obtain a new lower bound f_gp for f, and we explain how f_gp can be computed using geometric programming. The lower bound f_gp is generally not as good as the lower bound f_sos introduced by Lasserre [11] and Parrilo and Sturmfels [15], which is computed using semidefinite programming, but a run time comparison shows that, in practice, the computation of f_gp is much faster. The computation is simplest when the highest degree term of f has the form Σ_{i=1}^n a_i X_i^{2d} with a_i > 0 for i = 1, ..., n. The lower bounds for f established in [6] are obtained by evaluating the objective function of the geometric program at the appropriate feasible points.